Search icon CANCEL
Arrow left icon
Explore Products
Best Sellers
New Releases
Books
Videos
Audiobooks
Learning Hub
Conferences
Free Learning
Arrow right icon

Smart Spies attack: Alexa and Google Assistant can eavesdrop or vish (voice phish) unsuspecting users, disclose researchers from SRLabs

Save for later
  • 3 min read
  • 22 Oct 2019

article-image

In a new study security researchers from SRLabs have exposed a serious vulnerability - Smart Spies attack in smart speakers from Amazon and Google. According to SRLabs, smart speaker voice apps - Skills for Alexa and Actions on Google Home can be abused to eavesdrop on users or vish (voice-phish) their passwords.

The researchers demonstrated that with Smart Spies attack they can get these smart speakers to silently record users or ask their Google account passwords by simply uploading a malicious software disguised as Alexa skill or Google action.

The SRLabs team added "�. " (U+D801, dot, space) character sequence to various locations inside the backend of a normal Alexa/Google Home app. They tell a user that an app has failed, insert the "�. " to induce a long pause, and then prompt the user with the phishing message after a few minutes. This tricks users into believing the phishing message has nothing to do with the previous app with which they interacted.

Using this sequence, the voice assistants kept on listening for much longer than usual for further commands. Anything the user says is then automatically transcribed and can be sent directly to the hacker.



This revelation of Smart Spies attack is unsurprising considering Alexa and Google Home were found phishing and eavesdropping before. In June of this year, two lawsuits were filed in Seattle that allege that Amazon is recording voiceprints of children using its Alexa devices without their consent. Later, Amazon employees were found listening to Echo audio recordings, followed by Google’s language experts doing the same.

SRLabs researchers urge users to be more aware of Smart Spies attack and the potential of malicious voice apps that abuse their smart speakers. They caution users to be more aware of third-party app sources while installing a new voice app on their speakers.

Measures suggested to Google and Amazon to avoid Smart Spies attack

  • Amazon and Google need to implement better protection, starting with a more thorough review process of third-party Skills and Actions made available in their voice app stores.
  • The voice app review needs to check explicitly for copies of built-in intents.
  • Unlock access to the largest independent learning library in Tech for FREE!
    Get unlimited access to 7500+ expert-authored eBooks and video courses covering every tech area you can think of.
    Renews at $19.99/month. Cancel anytime
  • Unpronounceable characters like “�. “ and silent SSML messages should be removed to prevent arbitrary long pauses in the speakers’ output.
  • Suspicious output texts including “password“ deserve particular attention or should be disallowed completely.


In a statement provided to Ars Technica, Amazon said it has put new mitigations in place to prevent and detect skills from being able to do this kind of thing in the future. It said that it takes down skills whenever this kind of behavior is identified. Google also told Ars Technica that it has review processes to detect this kind of behavior, and has removed the actions created by the security researchers. The company is conducting an internal review of all third-party actions, and has temporarily disabled some actions while this is taking place.

On Twitter people condemned Google and Amazon and cautioned others not to buy their smart speakers.

https://twitter.com/ClaudeRdCardiff/status/1186577801459187712

https://twitter.com/Jake_Hanrahan/status/1186082128095825920

For more information, read the blog post on Smart Spies attack by SRLabs.

Google’s language experts are listening to some recordings from its AI assistant

Amazon’s partnership with NHS to make Alexa offer medical advice raises privacy concerns and public backlash

Amazon is being sued for recording children’s voices through Alexa without consent