
Alexa, Google Assistant, Siri, and others vulnerable to 'silent' commands

Google, Amazon, Microsoft, Apple, Samsung, and Huawei have been competing to develop the most robust and capable voice assistant over the past few years. Unfortunately, all of them have now been found to be affected by a vulnerability that lets attackers 'silently' control them on almost every device.

A team of researchers at China’s Zhejiang University discovered the vulnerability. Dubbing the technique ‘DolphinAttack’, the researchers modulated human voice commands onto ultrasonic frequencies above 20 kHz. This makes the sound inaudible to human ears, but still audible to the microphones present on several consumer devices, including iPads, iPhones, MacBooks, Apple Watches, Amazon’s Echo, a Lenovo ThinkPad T440p running Windows 10, and even an Audi Q3.
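For the curious, the core trick is amplitude modulation: the voice command is shifted onto an ultrasonic carrier so speakers reproduce it above the range of human hearing, while the non-linear behaviour of microphone hardware effectively demodulates it back into something the assistant can recognise. Below is a minimal, illustrative sketch of the modulation step in Python; the file names, the 25 kHz carrier, and the 192 kHz output rate are assumptions for this example, not values taken from the paper.

```python
# A minimal, illustrative sketch of the amplitude-modulation idea behind
# DolphinAttack: a recorded voice command is shifted onto an ultrasonic
# carrier so that the audible content sits above ~20 kHz. The file names,
# the 25 kHz carrier, and the 192 kHz output rate are assumptions for this
# example, not values taken from the paper.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

CARRIER_HZ = 25_000    # assumed ultrasonic carrier frequency
OUTPUT_RATE = 192_000  # playback hardware must support an ultrasonic-capable rate

# Read a recorded voice command (assumed to be a mono 16-bit WAV file).
in_rate, voice = wavfile.read("command.wav")
voice = voice.astype(np.float64) / 32768.0

# Upsample so the ultrasonic carrier can be represented at all.
n_out = int(len(voice) * OUTPUT_RATE / in_rate)
voice_hi = resample(voice, n_out)
voice_hi /= np.max(np.abs(voice_hi))

# Classic AM: carrier plus carrier scaled by the normalized voice signal.
t = np.arange(n_out) / OUTPUT_RATE
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
modulated = (1.0 + voice_hi) * carrier

# Save the result for playback through an ultrasonic-capable speaker.
wavfile.write("ultrasonic_command.wav", OUTPUT_RATE,
              (modulated * 0.5 * 32767).astype(np.int16))
```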

In experiments with the technique, the researchers successfully duped Google’s Assistant, Apple’s Siri, Amazon’s Alexa, Samsung’s now-defunct S Voice, Microsoft’s Cortana, and Huawei’s HiVoice. It’s not clear why Samsung’s newer Bixby voice assistant was not tested, though it may simply be because of how recently it launched.

The researchers managed not only to silently activate the voice assistants, but also to perform commands such as making a call, opening a website, turning on airplane mode, and unlocking the back door of a house fitted with a smart lock; the last one should be of most concern.

As Internet-connected locks, lights, home appliances, and other devices become more prominent, and as devices like the Amazon Echo and Google Home are pushed as the most convenient way to control a smart home, the security risks posed by such vulnerabilities increase considerably, as was terrifyingly depicted in the second season of ‘Mr. Robot’. It should be noted, however, that Google Home was not tested by the researchers.

All that's needed to execute a 'DolphinAttack'

On an even more concerning note, playing back audio at these ultrasonic frequencies requires equipment that costs a total of just $3, excluding the cost of a smartphone; anyone with the technical knowledge can, therefore, make use of the vulnerability.

However, there is a saving grace to all of this: for the ultrasonic audio to be picked up by these voice assistants, the attacker must be within five to six feet of the target device. Additionally, on most smartphones, triggering a voice assistant by calling its name only works if the device is unlocked.

In order to stop devices from responding to audio at ultrasonic frequencies, the voice assistants would have to begin ignoring commands at frequencies above 20 kHz, where humans can neither speak nor hear. Unfortunately, doing so might also reduce the accuracy and responsiveness of these voice assistants.
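As a rough illustration of what such filtering could look like in software, the sketch below low-passes a captured signal to the normal speech band and flags captures whose energy sits mostly above 20 kHz. The 8 kHz cutoff and the 50% energy threshold are arbitrary assumptions for this example, not anything the researchers or vendors have actually implemented.

```python
# A rough sketch of the mitigation described above: low-pass incoming audio
# to the normal speech band and flag captures whose energy is mostly
# ultrasonic. The 8 kHz cutoff and the 50% energy threshold are arbitrary
# assumptions for this example, not anything vendors actually ship.
import numpy as np
from scipy.signal import butter, sosfilt

def strip_ultrasound(samples: np.ndarray, sample_rate: int,
                     cutoff_hz: float = 8_000.0) -> np.ndarray:
    """Remove content above normal speech frequencies before recognition."""
    sos = butter(8, cutoff_hz, btype="low", fs=sample_rate, output="sos")
    return sosfilt(sos, samples)

def looks_like_dolphin_attack(samples: np.ndarray, sample_rate: int,
                              threshold: float = 0.5) -> bool:
    """Flag captures whose energy lies mostly above 20 kHz; real speech never does."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    ultrasonic_fraction = spectrum[freqs >= 20_000].sum() / (spectrum.sum() + 1e-12)
    return ultrasonic_fraction > threshold
```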

Of course, the other way to improve your security against such an attack is to disable trigger-word activation within the voice assistants, but that solution certainly defeats the purpose of these assistants: if they are not listening for our commands, what are they even here for? Also, let's not forget, machines are trying to mimic your voice as well.

Source: DolphinAttack research paper via Engadget, Fast Co.
