Researchers take control of Siri, Alexa, and Google Home with lasers

The newly discovered microphone vulnerability allows attackers to remotely inject inaudible and invisible commands into voice assistants using light.

Alexa lit up by Bell Tower laser
Daniel Genkin, EECS Assistant Professor, and Benjamin Cyr, EECS PhD Student, set up a laser in the Lurie Bell Tower to hack into a Google Home device in the Bob and Betty Beyster Building on November 25, 2019. A team of EECS researchers led by Kevin Fu, Associate Professor, and Daniel Genkin, Assistant Professor, has discovered a microphone vulnerability that allows attackers to remotely inject inaudible and invisible commands into voice assistants such as Siri and Alexa using light. Photo: Joseph Xu/Michigan Engineering, Communications and Marketing

Of all the many complex devices we carry around or keep in the home, the ubiquitous virtual assistant seems the most straightforward to disrupt. Alexa, Google Assistant, Siri, and others like them respond to sounds in their environment – an errant spoken command is all it would take to provoke unwanted behavior.

But in a surprising twist, sound isn’t the most insidious vector for hijacking your phone or your Amazon Echo. A team of researchers at the University of Michigan and the University of Electro-Communications, Tokyo, has upped the ante and hijacked these assistants with light.

The newly discovered Light Commands vulnerability affects the MEMS (microelectromechanical systems) microphones in these devices and allows attackers to remotely inject inaudible and invisible commands into voice assistants using light. The effect was demonstrated by Profs. Kevin Fu, Daniel Genkin, and Takeshi Sugawara (University of Electro-Communications, Tokyo), as well as Dr. Sara Rampazzi and PhD student Benjamin Cyr. The team successfully used light to inject malicious commands into several voice-controlled devices, including smart speakers, tablets, and phones, from far away and through glass windows.

Risks associated with these attacks range from benign to frightening, depending on how much a user has tied to their assistant. In their demonstrations, the researchers were able to unlock a victim’s smart-lock-protected front door and even locate, unlock, and start various vehicles. In short, once an attacker gains control of a voice assistant, a number of other systems could be open to manipulation.

In the worst cases, this could mean dangerous access to e-commerce accounts, credit cards, and even connected medical devices the user has linked to their assistant.

The discovery hinged on the fact that microphones convert sound into electrical signals. The group was able to elicit the same response from MEMS microphones by aiming light directly at them, essentially tricking the microphone into producing signals as if it were picking up audio. By encoding an audio command in the intensity of the light beam, they could dictate specific commands to the device from afar.

The range of these attacks is limited only by the intensity of the attacker’s laser and their line of sight. In one particularly compelling demonstration, the team used a laser from 75 meters away, at a 21° downward angle, and through a glass window to force a Google Home to open a garage door; they could also make the device say what time it was from 110 meters away.

To highlight the ease of exploiting this weakness, the researchers aimed and focused their light commands with nothing more than a telescope, a telephoto lens, and a tripod. They benchmarked the attack’s effectiveness on 17 different devices running the most popular voice assistants.

The team’s work is presented in the paper “Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems,” with additional information on https://lightcommands.com/.

Read more on:

January 6, 2020: SmarterEveryDay

Breaking Into a Smart Home With A Laser – Smarter Every Day 229

Graduate student Ben Cyr demonstrates how his lab was able to hack into smart speakers with a laser.

November 5, 2019: NBC

The smart speaker in your home may not be as secure as you think

Researchers, including EECS-CSE associate professor Kevin Fu, have discovered an exploit that makes home assistants vulnerable to lasers.

November 5, 2019: Wired

Hackers can use lasers to ‘speak’ to your Amazon Echo or Google Home

Researchers, including EECS-CSE associate professor Kevin Fu, have discovered an exploit that makes home assistants vulnerable to laser attacks.

November 5, 2019: The New York Times

With a laser, researchers say they can hack Alexa, Google Home or Siri

EECS-CSE associate professor Kevin Fu’s research has found a vulnerability in home assistants, reports The New York Times.