GhostStripe attack haunts self-driving cars by making them ignore road signs

Six boffins, mostly hailing from Singapore-based universities, have shown it is possible to attack autonomous vehicles by exploiting their reliance on camera-based computer vision, causing them to fail to recognize road signs. The attack system, dubbed GhostStripe [PDF], is undetectable to the human eye but could be deadly to Tesla and Baidu Apollo users, as it manipulates the complementary metal oxide semiconductor (CMOS) camera sensors employed by both brands.

Source: The Register
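
As a rough intuition for how a sensor-level trick can be invisible to humans yet plainly visible to a camera, the sketch below simulates a rolling-shutter capture under a rapidly flickering light source. This is only an illustration of the general class of CMOS-sensor attacks, not the researchers' code: the sensor resolution, per-row readout time, and flicker rate are all assumptions chosen for the example.

```python
# Illustrative sketch (not the GhostStripe implementation): a CMOS rolling
# shutter exposes each image row at a slightly different time, so a light
# flickering far faster than human perception can still leave dark/bright
# bands in the captured frame. All parameters below are assumptions.
import numpy as np

ROWS, COLS = 480, 640      # hypothetical sensor resolution
ROW_READOUT_S = 30e-6      # assumed per-row readout time (rolling shutter)
FLICKER_HZ = 2_000         # flicker far above human flicker-fusion (~60-90 Hz)

def captured_frame(scene: np.ndarray) -> np.ndarray:
    """Simulate a rolling-shutter capture of `scene` lit by a flickering source."""
    row_times = np.arange(ROWS) * ROW_READOUT_S
    # Square-wave light: on for half of each flicker period, off for the other.
    light_on = np.floor(row_times * FLICKER_HZ * 2) % 2
    # Rows exposed while the light is off come out darker, producing stripes.
    return scene * (0.5 + 0.5 * light_on)[:, None]

scene = np.full((ROWS, COLS), 200.0)   # a uniformly lit surface, e.g. a road sign
frame = captured_frame(scene)

# A human observer perceives steady illumination; the camera records banding.
print("distinct row intensities in captured frame:", np.unique(frame[:, 0]))
```

Running the sketch prints two distinct row intensities: alternating dark and bright bands that a downstream image classifier sees in every frame, even though a human looking at the same scene perceives nothing unusual.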
