Laser-hacking voice assistants

Who knew? Microphones respond to light as well as sound. That lets attackers hijack voice-assistant technologies such as Siri, Alexa and Google Assistant, sometimes from a considerable distance. These so-called ‘light commands’ can be generated with cheap, readily available electronic components: laser pointers, laser diode drivers and audio amplifiers. Add an optional telephoto lens and you can focus the laser for long-range attacks.
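In essence, the attack amplitude-modulates the laser’s intensity with the waveform of a spoken command; the microphone then converts those intensity fluctuations back into an audio signal. Here is a minimal, purely illustrative Python sketch of that modulation step: the bias and modulation-depth values and the WAV filename are hypothetical, and a real setup would feed the result to a laser diode driver with an analogue modulation input.

```python
import numpy as np
from scipy.io import wavfile

# Illustrative only: amplitude-modulate a voice command onto a laser
# diode's drive current, the core idea behind 'light commands'.
# Both values below are hypothetical.
BIAS_mA = 200.0    # DC bias keeping the diode above its lasing threshold
DEPTH_mA = 150.0   # peak modulation depth around that bias

def audio_to_drive_current(wav_path):
    """Map a recorded command to a laser-diode current waveform."""
    rate, samples = wavfile.read(wav_path)
    if samples.ndim > 1:                       # mix stereo down to mono
        samples = samples.mean(axis=1)
    audio = samples / np.max(np.abs(samples))  # normalise to [-1, 1]
    # The MEMS microphone turns intensity fluctuations back into
    # 'sound', so the drive current simply follows the audio waveform.
    return rate, BIAS_mA + DEPTH_mA * audio

# Hypothetical input file containing the spoken command
rate, current = audio_to_drive_current("open_the_garage_door.wav")
print(f"{len(current)} samples at {rate} Hz, "
      f"swinging {current.min():.0f}-{current.max():.0f} mA")
```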

The vulnerability lies in the MEMS (micro-electro-mechanical systems) microphones used in most voice-assistant devices: they respond to fluctuations in light intensity as if they were sound waves. This short video explains how it happens:

In practice, that means hackers can remotely inject inaudible, invisible commands into your voice assistant. Here, the researchers tell Google Assistant to open a garage door through a glass window, from an adjacent building 70 metres (230 feet) away:

And here they are asking the time – from 110 metres (360 feet) away.

The Light Commands website has the full details, along with links to the five researchers’ (from the University of Michigan and the University of Electro-Communications in Tokyo) original paper, Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems.
