New research demonstrates that smart devices can be tricked into following voice commands delivered by laser pointers. The technique can make smart speakers, smartphones, and tablets perform numerous tasks, even from hundreds of feet away.
Scientists from Tokyo and the University of Michigan have demonstrated that they could control Google Assistant, Apple Siri, and Amazon Alexa devices by pointing lasers or flashlights at their microphones.
The researchers detailed the vulnerability in a paper written after seven months of testing. By shining laser pointers focused through a telephoto lens, they could take over smart speakers from 230 to 350 feet away. In one demonstration, the Google Home they made open a garage door was inside a room in a different building: the laser they shined at the device's microphone port through the window was modulated to carry the same signal as the voice command "OK Google, open the garage door."
Not the First Vulnerability of Its Kind
The team explained that a small membrane inside a device's microphone, called a diaphragm, moves when light is pointed at it. A laser whose intensity is modulated with a voice recording can therefore reproduce the vibrations a real voice would cause, and the microphone converts those vibrations into the electrical signals the device understands.
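The core idea is intensity modulation: the laser is driven at a constant brightness, and the voice waveform shifts that brightness up and down so the diaphragm moves as if sound were hitting it. The sketch below illustrates this numerically; the function name, bias, and modulation-depth values are illustrative assumptions, not details from the paper.

```python
import numpy as np

def audio_to_laser_drive(audio, bias=0.5, depth=0.4):
    """Map an audio signal in [-1, 1] to a normalized laser drive level.

    Illustrative only: the laser sits at a constant intensity (`bias`)
    and the audio modulates it by up to `depth`, so the light intensity
    traces the voice waveform -- which the microphone's diaphragm then
    follows as if it were sound pressure.
    """
    audio = np.clip(audio, -1.0, 1.0)
    drive = bias + depth * audio
    # Keep the drive level inside the diode's valid 0..1 operating range.
    return np.clip(drive, 0.0, 1.0)

# A one-second 1 kHz test tone standing in for a recorded voice command.
sample_rate = 48_000
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 1_000 * t)

drive = audio_to_laser_drive(tone)
print(drive.min(), drive.max())
```

With the assumed bias of 0.5 and depth of 0.4, the drive level swings between 0.1 and 0.9, so the laser never turns fully off or saturates while still carrying the audio waveform.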
Hijacking the Google Home to open a garage door was a simple thing to do, the researchers explained. The range of actions possible with this method includes making online purchases, opening doors secured by smart locks, and even remotely unlocking cars connected to voice-assistant accounts.
The scientists have already notified Tesla, Ford, Apple, Google, and Amazon about the problem. Fixing it would require most microphones to be redesigned, as simply covering them with tape does not work.
The devices vulnerable to this type of hijack are Google Home and Google Nest, Echo Plus, Echo Show, Echo Dot, Facebook Portal Mini, Ecobee 4, Fire TV Cube, iPhone XR, iPad 6th Generation, Samsung Galaxy S9, and Google Pixel 2.
This is not the first such weakness security researchers have found. Scientists from China's Zhejiang University discovered that Siri, Alexa, and other smart assistants could be commanded with instructions sent at ultrasonic frequencies.
Also, a group from the University of California, Berkeley, discovered that they could hijack smart speakers by embedding commands inaudible to the human ear directly into music or spoken-text recordings.