Researchers, including one of the people who discovered the massive Meltdown and Spectre vulnerabilities that chipmakers and computer makers hurried to patch in early 2018, have found a serious security issue that can affect practically every smart speaker from Amazon, Google, or Apple. Laser beams aimed at the microphones in the speakers are interpreted as voice commands. The researchers were able to perform all sorts of actions through these smart devices, and there’s no real fix for the time being. All you can do is make sure your speaker isn’t facing a window and that it can’t access sensitive data or devices. Then again, there’s no evidence that anyone has been abusing the unexpected flaw, and actually making the attack work requires plenty of effort.
Researchers in Japan and at the University of Michigan have been studying the issue for seven months, The New York Times explains. They’ve been able to open a garage door by hitting a voice assistant with a laser beam, and they were able to control a Google Home device on the fourth floor of a building from the top of a different building 230 feet away.
This shows that an attacker could attempt to open smart doors, unlock smart cars, and access anything a Google Home, Amazon Echo, or Apple HomePod would have access to. The Times report only mentions products from Amazon, Apple, and Google, but other 2019 smart speakers, and earlier models, are susceptible to the same hack. Amazon and Google, of course, are the more popular ones. The study also lists other products that can take voice commands, including the Facebook Portal Mini, Fire TV Cube, EchoBee 4, iPhone XR, iPad 6th gen, Galaxy S9, and Google Pixel 2.
The microphones in these devices contain a small diaphragm that moves when sound hits it. But the diaphragm also moves when light from a laser or a flashlight reaches it. The microphone converts that movement into electrical signals, which is why the speaker responds to light as if it were sound.
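The principle can be sketched in code: the attacker amplitude-modulates the laser’s intensity with an audio waveform, and because the diaphragm responds to the varying light the way it responds to sound pressure, the device effectively recovers that waveform. The sketch below is purely illustrative and not from the study; the 440 Hz stand-in tone and all function names are assumptions.

```python
import math

SAMPLE_RATE = 8000  # samples per second (assumed for the illustration)

def voice_signal(n):
    """A stand-in 440 Hz tone representing a spoken command."""
    return [math.sin(2 * math.pi * 440 * t / SAMPLE_RATE) for t in range(n)]

def modulate_laser(audio, power=1.0):
    """Map audio in [-1, 1] onto laser intensity in [0, power].

    Light intensity can't be negative, so the audio rides on a
    constant (DC) offset of half the laser power."""
    return [power * (1 + s) / 2 for s in audio]

def diaphragm_response(intensity):
    """The diaphragm tracks intensity changes; subtracting the mean
    (the DC offset) leaves a signal proportional to the original audio."""
    mean = sum(intensity) / len(intensity)
    return [i - mean for i in intensity]

audio = voice_signal(800)
light = modulate_laser(audio)
recovered = diaphragm_response(light)
# `recovered` is the original command scaled by power/2, so the
# speaker's voice-recognition pipeline hears the injected command.
```

This is why the fix is mechanical rather than software-based: as long as the diaphragm moves in response to light, the downstream electronics can’t tell the two apart.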
The only way the issue can be fixed is by adopting a different microphone design for future smart speaker models. The study suggests that all current models remain susceptible to the hack.
Companies including Amazon, Apple, Google, Ford, and Tesla have been notified and are analyzing the conclusions of the study.
Just because this huge and unexpected flaw exists in smart speakers doesn’t mean anyone can take advantage of it. Even if you could get all the equipment in the image above, you’d still need to translate voice commands into modulated laser pulses before you could control a smart speaker. Simply shooting a beam of light at a device’s microphones won’t give the person with the laser full and instant control over a smart home’s devices. And if anybody with all that sophisticated tech on hand, and the experience to use it, is targeting your home, well, you’ve got bigger things to worry about.
In the meantime, you’ll want to use PINs to protect your sensitive information, and maybe turn off the microphones every time you leave home. Hacking phones and tablets from a distance with laser beams seems like the more dangerous side effect of the flaw, and the kind of attack you’d see in spy movies. However, it’s probably more difficult to pull off than with speakers.
Still, it’s likely smart speaker manufacturers will take measures to prevent such hacks on future machines.