Hidden voice commands could hijack smartphones
Borg-like voice commands from your phone could take control of nearby Android devices using a newly demonstrated voice recognition vulnerability.
Security researchers have discovered a way of hiding voice commands in online videos that could take control of smartphones and tablets.
In a paper, the researchers described how voice recognition features such as Google Now, Siri and Cortana can be abused. In a YouTube video, they demonstrated a proof-of-concept attack against an Android smartphone.
The hidden voice commands, heavily disguised and sounding much like the Borg from Star Trek, are hard for humans to comprehend but easily recognised by a device's speech recognition system.
“Ok Google, Open XKCD.com,” the voice is heard saying before a nearby phone opens that URL. In other examples, the researchers turned on a phone's airplane mode.
The hack might not work every time; success depends on factors such as the ambient sound around a device and whether the phone is within range of the commands. Even so, enough devices could be directed to a malware-hosting URL to put smartphones under a criminal's control.
The researchers said that people with very little technical know-how could carry out the attack.
"Hidden voice commands can be constructed even with very little knowledge about the speech recognition system," the paper's authors said. "We provide a general attack procedure for generating commands that are likely to work with any modern voice recognition system."
The researchers said that these attacks can be mitigated in a number of ways.
“Passive defences that notify the user an action has been taken are easy to deploy and hard to stop but users may miss or ignore them. Active defences may challenge the user to verify it is the owner who issued the command but reduce the ease of use of the system. Finally, speech recognition may be augmented to detect the differences between real human speech and synthesised obfuscated speech,” the researchers said.
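The “active defence” the researchers mention can be pictured as a challenge-response check before any sensitive command runs: the assistant speaks a random word that a pre-recorded attack clip cannot predict. Everything in the sketch below, including the word list, command names and function, is hypothetical.

```python
import secrets

# Hypothetical active defence: before executing a sensitive voice command,
# the assistant speaks a random challenge word and only proceeds if the
# user repeats it. A pre-recorded attack clip cannot guess the word.
CHALLENGE_WORDS = ["apple", "river", "copper", "violet", "sparrow", "marble"]
SENSITIVE = {"open_url", "toggle_airplane_mode", "send_sms"}

def confirm(command, spoken_reply, challenge):
    """Run non-sensitive commands immediately; gate sensitive ones."""
    if command not in SENSITIVE:
        return True
    return spoken_reply.strip().lower() == challenge

challenge = secrets.choice(CHALLENGE_WORDS)
# The device would say: "To continue, say the word <challenge>."
confirm("open_url", challenge, challenge)   # True: the owner repeated it
confirm("open_url", "xkcd", challenge)      # False: a canned clip can't guess
```

As the researchers note, the trade-off is usability: every extra confirmation step chips away at the hands-free convenience that voice control is meant to provide.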
The team's findings are due to be presented in August at the USENIX Security Symposium in Austin, Texas.
Henry Hoggard, security consultant at MWR InfoSecurity, told SCMagazineUK.com that for this attack to be possible, users have to explicitly enable device-wide voice commands, “which means voice commands can be called when any application is open and even when the device is locked, instead of the default scenario, where it is only accessible from the Google Voice command widget on the device homepage”.
"Users can still use voice recognition functions without this setting enabled by pressing the Google Voice search button on their device's homepage or by saying ‘OK Google' on the device's homepage,” he said.
Hoggard added that by default, device-wide voice commands are disabled on Android (tested on Android 6.0.1), therefore devices will already be protected against this attack unless a user explicitly enables device-wide voice commands.
"The worst that can happen is when an attacker can redirect a user to a malicious web page. However there are easier ways for an attacker to trick a user into visiting a malicious web page, such as phishing,” he said.
Ryan Wilk, director at NuData Security, told SC that users should ensure that voice recognition is not set to ‘always on'.
“It would be wise to disable voice commands for very sensitive functions such as logging into bank accounts, and it's a good idea to review the voice detection options in your settings; for example, disable the ‘Trusted Voice' phone-unlock feature,” he said.
“According to the study, users can be alerted to some extent by enabling ‘beep', ‘buzz' and ‘light show' notifications telling you that voice commands have been accepted. While they say hackers may be able to mask these, enabling notifications may provide at least some early warning that all is not well, if you pay attention to them. Setting up confirmation prompts for voice commands could also defend against these attacks, though this may undermine the convenience that voice recognition is meant to provide in the first place.”