Written by Amer Owaida, Security Writer at ESET
If Siri, Google Assistant or any other voice assistant is part of your daily routine, you may be unnerved to find out that attackers, too, could activate it – all without you hearing a thing. A group of US and Chinese researchers conducted a series of experiments proving that, under the right conditions, the voice assistant on your smartphone could be fooled into spilling sensitive information or carrying out certain tasks.
According to the paper, the researchers tested 17 popular smartphones. All but two devices turned out to be susceptible to the attack, called SurfingAttack, which uses ultrasonic guided waves to elicit a reaction from the voice assistants. Although this research isn’t the first to demonstrate inaudible attacks in action, its predecessors such as DolphinAttack or LipRead focused on over-the-air transmission and one-way interaction.
SurfingAttack, on the other hand, works over a solid medium and allows multi-round interactions with the device – important because smartphone voice assistants ask follow-up questions to specify a task and require answers to perform it. The researchers attempted the ultrasonic attacks through four representative table materials: glass, metal, wood, and plastic. The attack was transmitted via a piezoelectric disc attached to the underside of the table, while the targeted smartphone was positioned on top of it.
The exchange was also recorded by a hidden microphone to emulate how the attack would take place and how the bad actors would obtain the data. To illustrate what attackers would be able to access, the research team took selfies, read out SMS messages and even placed fraudulent calls using the phones. Access to text messages and calls is especially worrying.
For example, if you secure your accounts with two-factor authentication (2FA) and receive your authentication codes via SMS, this attack could allow hackers to bypass that extra security layer and gain access to your online services of choice. The ne’er-do-wells could also make your voice assistant dial numbers that forward your call abroad or even to a collect number, racking up obscene charges in the process.
The study concludes with countermeasures to mitigate the threat, which are also summed up here.