Takeshi Sugawara, a cybersecurity researcher, walked into the office of Kevin Fu, a professor at the University of Michigan, in the spring of last year. He wanted to show off a strange trick he had discovered. Sugawara pointed a high-powered laser at the microphone of his iPad (all enclosed in a black metal box to prevent anyone from being burned or blinded) and had Fu put on a pair of earbuds to listen to the sound the iPad's microphone picked up. As Sugawara varied the laser's intensity in the shape of a sine wave, fluctuating about 1,000 times a second, Fu heard a distinct tone. Inexplicably, the iPad's microphone had converted the laser light into an electrical signal, just as it would a sound.
Six months later, Fu's group at the University of Michigan, working with researchers at the Tokyo University of Electro-Communications, has turned that curious photoacoustic quirk into something far more worrying. They can now use lasers to silently "speak" commands to any computer that accepts voice input, including smartphones, Amazon Echo speakers, Google Home devices, and Facebook Portal video chat devices. That spy trick lets them send "light commands" from hundreds of feet away: they can open garage doors, make online purchases, and trigger all manner of other mischief or mayhem. The attack can easily pass through a window, too, as long as the device's owner fails to notice a telltale spot of flickering light or the target device's audible responses.
"It's possible to make microphones respond to light as if it were sound," Sugawara says. In the months of experiments that followed his initial discovery, the researchers found that when they pointed a laser at a microphone and varied the light's intensity at a precise frequency, the light somehow perturbed the microphone's membrane at that same frequency. The positioning didn't have to be especially accurate; in some cases, they simply flooded the device with light.
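The mechanism described here amounts to amplitude modulation: the laser's brightness traces the audio waveform of the spoken command. The sketch below illustrates the idea in plain Python; the normalized drive range, DC bias, modulation depth, and sample rate are illustrative assumptions, not values from the research.

```python
import math

def am_modulate(audio, bias=0.5, depth=0.5):
    """Amplitude-modulate a laser's drive intensity with an audio waveform.

    audio: samples in [-1, 1]. Returns normalized intensities in [0, 1]:
    the DC bias keeps the laser lit, and the audio rides on top of it.
    """
    return [bias + depth * sample for sample in audio]

# One period of a 1,000 Hz test tone, like the one Fu heard through the
# earbuds, sampled at 48 kHz (the sample rate is an assumption here).
SAMPLE_RATE = 48_000
tone = [math.sin(2 * math.pi * 1_000 * n / SAMPLE_RATE) for n in range(48)]

# The resulting drive signal stays within the valid [0, 1] range while
# flickering in brightness 1,000 times per second.
intensity = am_modulate(tone)
```

A microphone that converts this flickering light back into an electrical signal would "hear" the original 1,000 Hz tone.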
Using a Laser to Hack Amazon and Google Voice Control
Using a 60-milliwatt laser, the researchers "spoke" commands into 16 smart speakers, smartphones, and other voice-activated devices. Nearly all of the smart speakers registered the commands from 164 feet away, the maximum distance tested. Smartphones proved trickier: an iPhone could only be controlled from around 33 feet away, and two Android phones only from around 16 feet.
In a second experiment, the researchers tested the technique with a 5-milliwatt laser (comparable to a cheap laser pointer) aimed from 361 feet away down a hallway. While most devices failed at that range, they could still control a first-generation Google Home and an Echo Plus. In another experiment, they successfully beamed laser commands through a window at a Google Home microphone inside a nearby building.
How Does It Work?
The "voice" commands encoded in the laser beam would be completely silent, the researchers point out. An observer might, at most, notice a blue spot of light flickering on the device's microphone. "Your assumptions about blocking sound aren't valid," says Daniel Genkin, a professor at the University of Michigan who co-led the team. "You can think of this security problem as a laser beaming voice commands through the window at a voice-activated device."
The researchers suggest that a voice-assistant hacker could be even stealthier by using an infrared laser, which is invisible to the naked eye. And while voice assistants typically give an audible response, a hacker could send an initial command that turns the volume down to zero. (They tested an infrared laser and found that it worked to control an Echo and a Google Home at close range, but didn't try it at longer distances for fear of burning or blinding someone.) Although they didn't explicitly test it, the researchers also believe an attacker could use light commands to trigger Amazon Alexa's "whisper mode."
When it came to the physics of how a microphone could interpret light as sound, the researchers had a surprising answer: they don't know. In fact, in the interest of scientific rigor, they declined to even speculate about the photoacoustic mechanism behind the light-as-speech effect.
What Do Researchers Say About This Strange Effect?
Paul Horowitz, a professor emeritus of physics and electrical engineering at Harvard and co-author of The Art of Electronics, says that at least two different physical mechanisms could produce the vibrations that make light commands possible. First, a pulse of laser light could heat the microphone's diaphragm, which in turn would heat the air around it, raising its pressure and creating a vibration just as sound does. Alternatively, Horowitz suggests, if the components of a target device aren't fully opaque, the laser light could pass through the microphone and strike its electronic chip directly. The light could then produce a photovoltaic effect, the same phenomenon that converts light into electricity in solar cells and at the receiving ends of fiber-optic cables, generating an electrical signal that the device interprets as a voice command.
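Horowitz's second hypothesis, the photovoltaic one, can be sketched numerically: photocurrent in a photodiode is roughly proportional to incident optical power, so any intensity modulation on the beam reappears unchanged as an electrical signal. The responsivity figure below is an illustrative assumption, not a measurement from the research.

```python
# Linear photodiode model: photocurrent I = R * P, where R is the
# responsivity and P the incident optical power. Under this model, the
# "audio" encoded in the light survives the conversion intact.
RESPONSIVITY_A_PER_W = 0.4  # amps per watt; an illustrative assumption

def photocurrent(optical_power_watts):
    """Convert incident optical power to photocurrent (linear model)."""
    return RESPONSIVITY_A_PER_W * optical_power_watts

# A 5 mW beam whose intensity swings between 4 mW and 6 mW (plus or
# minus 20 percent) yields a photocurrent swinging by the same fraction.
currents = [photocurrent(p) for p in (0.004, 0.005, 0.006)]
swing = (currents[2] - currents[0]) / (2 * currents[1])  # fractional swing
```

Because the conversion is linear, the electrical signal preserves the modulation frequency exactly, which is all a voice assistant's audio pipeline needs to "hear" a command.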
"There is no shortage of hypotheses, one or more of which are occurring here," Horowitz says. The potential havoc covers everything from smart-home systems, such as door locks and thermostats, to remotely unlocking cars. "It's the same kind of risk as any voice system, but with an unusual range effect," says Fu. "Now the question is how powerful your voice is and what you've linked it to," adds Sara Rampazzi, a researcher at the University of Michigan. A Google spokesperson told WIRED, "We are closely reviewing this research paper. It's vital to keep our users secure, and we're always looking at ways to improve the security of our devices." An Amazon spokesperson wrote, "We are studying this research and continue to engage with the researchers to learn more about their work."
How Can You Protect Against This Attack?
Some devices do offer authentication protections that could foil a laser-wielding attacker. iPhones and iPads require users to prove their identity via TouchID or FaceID before making purchases, for instance. And on most smartphones, the researchers acknowledge, voice commands only work after a "wake word" spoken in the voice of the phone's owner. But they warn that an attacker who obtained or reconstructed recordings of those words, like "Hey Siri" or "OK Google," could then use them as a preface to light commands delivered in the target user's own voice.
Smart speakers like the Echo and Google Home, by contrast, don't authenticate the speaker at all. And because of the physical nature of the flaw, no software update can fully fix it. The researchers do recommend some less-than-ideal mitigations, such as requiring a spoken PIN before voice assistants execute the most sensitive commands. They also recommend changes to future device designs, such as adding light shielding around the microphone, or listening for voice commands with two separate microphones on opposite sides of the device, which would be difficult for a laser to hit at the same time.
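The dual-microphone idea can be sketched as a simple cross-check: accept a command only if both microphones register comparable signal energy. This is a minimal illustration, not the researchers' implementation; the energy metric and threshold are assumptions, and a real device would need careful acoustic calibration.

```python
def commands_agree(mic_a, mic_b, ratio_threshold=0.5):
    """Accept a command only if both microphones heard comparable energy.

    Rationale: a real voice reaches both microphones, while a narrow
    laser beam hits only one, leaving the other nearly silent.
    """
    energy_a = sum(s * s for s in mic_a)
    energy_b = sum(s * s for s in mic_b)
    if max(energy_a, energy_b) == 0:
        return False  # nothing heard at all
    return min(energy_a, energy_b) / max(energy_a, energy_b) >= ratio_threshold

# Genuine speech: both microphones pick up roughly the same signal.
voice = [0.1, -0.2, 0.3, -0.1]
speech_ok = commands_agree(voice, [0.9 * s for s in voice])  # accepted

# Laser injection: only one microphone "hears" anything.
laser_ok = commands_agree(voice, [0.0, 0.0, 0.0, 0.0])  # rejected
```

Even a check this crude would force an attacker to hit two microphones on opposite sides of the device with matching beams simultaneously.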
Until such fixes or design changes arrive, Michigan's Genkin suggests a simple, if counterintuitive, rule of thumb for anyone concerned about the attack: "Don't put voice-activated devices within an adversary's line of sight," he says. If someone can see your Echo or Google Home through a window, they can talk to it, too.