Light Commands – Hackers Can Use Light to Inject Arbitrary Voice Commands and Hack Alexa, Siri, and Other Voice Assistants

Light Commands is a new attack that lets an attacker inject arbitrary audio signals into voice assistants using light, from a very long distance. Security researchers from the University of Electro-Communications and the University of Michigan discovered this new class of injection attack, dubbed “Light Commands”: a vulnerability in MEMS microphones that allows attackers to inject inaudible and invisible commands into voice assistants.
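The core idea behind the attack is amplitude modulation: the attacker encodes a voice command in the intensity of a laser beam, and the MEMS microphone's diaphragm responds to the modulated light as if it were sound. The sketch below illustrates that modulation step only; the bias current, modulation depth, and function names are illustrative assumptions, not details from the article or the researchers' setup.

```python
import numpy as np

# Illustrative sketch (assumptions, not from the article): Light Commands
# amplitude-modulates a laser's intensity with the target voice command, and
# the MEMS microphone picks up the light variations as if they were sound.

SAMPLE_RATE = 16_000          # Hz, a typical voice-assistant audio rate
BIAS_CURRENT_MA = 200.0       # hypothetical laser-diode DC bias
MODULATION_DEPTH = 0.5        # fraction of the bias swung by the audio signal


def audio_to_laser_drive(audio: np.ndarray) -> np.ndarray:
    """Map a voice-command waveform in [-1, 1] to a laser drive current (mA).

    The audio amplitude-modulates the laser intensity around a DC bias;
    the microphone then recovers it as an (inaudible-to-humans) command.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT_MA * (1.0 + MODULATION_DEPTH * audio)


if __name__ == "__main__":
    # Stand-in for a recorded command: a one-second 440 Hz tone.
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    command = 0.8 * np.sin(2 * np.pi * 440 * t)
    drive = audio_to_laser_drive(command)
    print(f"drive current range: {drive.min():.1f}-{drive.max():.1f} mA")
```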

Read full article on GBHackers
