Graffiti on stop signs could trick driverless cars into driving dangerously

An Uber driverless car Credit: AFP

Driverless cars could be tricked into ignoring road signs by simple vandalism, researchers have found.

While carmakers have been investigating ways to protect autonomous cars against hackers, more conventional attacks could confuse the vehicles into misreading road signs that would appear normal to ordinary drivers.

Placing stickers or posters over part or all of a road sign could trick a smart car into ignoring a stop sign, even though the sign looks unchanged to human drivers.

Researchers at the University of Washington demonstrated how car hackers who had gained access to the visual recognition software within the vehicle could create simple alterations to road signs that would cause the car to misread them. 

The researchers said changes that trick learning algorithms, such as those used in driverless cars, can cause them "to misbehave in unexpected and potentially dangerous ways."

Stickers caused cars to read a stop sign as a speed limit sign Credit: University of Washington

In one example, graffiti stickers reading "Love / Hate" were added to a stop sign, causing it to be misread as a 45mph speed limit sign. In another trick, the researchers printed out a right-hand turn sign that looked almost identical to a legitimate one, but subtle colour changes caused the sign to be misinterpreted as a stop sign.

"Both of our attack classes do not require special resources—only access to a colour printer and a camera," the researchers said. The team said they hoped the research could help build better defensive systems into autonomous vehicles.

Although the turn right sign appears very similar to a legitimate one, the computer misread it Credit: University of Washington

Such attacks could cause cars to drive straight through junctions or come to a halt in the middle of the road. Some current cars are already equipped to read and detect signs, such as Tesla's Model S electric cars with their Autopilot feature, although the vehicles are not yet programmed to react to the signs.

Threats to self-driving cars that would normally not affect human drivers have proved tricky for researchers to counter. Engineers at Volvo have been trying for two years to teach their cars how to avoid kangaroo collisions, while a team at Waymo was forced to develop a pair of tiny windscreen wipers to clean bird droppings that masked the cars' cameras and LIDAR systems.

This week the government announced new guidance to develop safer driverless cars. Transport minister Lord Callanan said: "We need to make sure that the designs of the vehicles in the first place are completely cyber secure so that people can't break into them, they can't steal them and more importantly they can't hack them to potentially cause accidents."

 
