
Incorrect Alexa, Siri, Google Assistant, and Cortana Trigger Words Are Compromising Your Privacy

A study found more than 1,000 phrases that incorrectly wake voice assistant devices, including Alexa, Siri, Cortana, and Google Assistant.

Cameron Coward
4 years ago · Security / Alexa Gadgets / Voice

Most people are aware that they're trading some privacy for the convenience offered by their devices' features. You might, for example, allow Google to track your location so that you can get a notification if there is a traffic jam along your route. But the voice assistants offered by Apple, Google, Amazon, Microsoft, and others are much more concerning, as they can potentially monitor everything you say. All of those companies deny doing that, claiming their assistants only listen after you say the respective wake word. But a recent study found more than 1,000 phrases that can incorrectly wake these devices.

If you're using Google Assistant, either on your phone or a device like the Google Home, it is supposed to listen only after you say the trigger phrase "OK, Google." But you've probably noticed that it sometimes mishears you and wakes up while you're just having a normal conversation with a friend. Even dialog in a TV show or movie can inadvertently trigger your voice assistant to start listening. Most of the time, that isn't much more than an annoyance. But it presents a real problem if you're concerned about your privacy, especially since voice assistant recordings have already been subpoenaed by courts and even used for more illicit purposes.

A study conducted by Lea Schönherr, Maximilian Golla, Jan Wiele, Thorsten Eisenhofer, Dorothea Kolossa, and Thorsten Holz found that every commercial voice assistant they tested can be awakened by incorrect trigger words and phrases. The researchers identified more than one thousand such phrases through an automated process that played professional audio datasets and media, such as TV shows and newscasts, to the devices. For instance, the word "Montana" would wake Microsoft's Cortana and "unacceptable" would wake Amazon's Alexa. It seems unlikely that these words were explicitly programmed to act as triggers; more likely, they simply sound similar to the real wake words. But the corporations that developed these voice assistants have little incentive to prevent accidental triggers, which is absolutely problematic.
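To get an intuition for why a sound-alike word can slip past a wake-word detector, consider a toy sketch. This is not the study's actual method (the researchers worked with audio, not text), and the wake-word list, threshold, and test phrases below are purely illustrative assumptions. The idea is simply that any detector which accepts inputs "close enough" to its wake word will also accept other words that happen to land inside that acceptance region:

```python
from difflib import SequenceMatcher

# Toy stand-in for a wake-word detector's acceptance test.
# Real detectors score acoustic features, not spellings; this
# only illustrates how a lenient similarity threshold lets
# sound-alike words through.
WAKE_WORDS = ["alexa", "cortana", "hey siri", "ok google"]  # illustrative list
THRESHOLD = 0.7  # hypothetical acceptance threshold

def similarity(a: str, b: str) -> float:
    """Crude similarity score between two phrases (0.0 to 1.0)."""
    return SequenceMatcher(None, a, b).ratio()

def false_accepts(candidates: list[str]) -> list[tuple[str, str, float]]:
    """Return candidate phrases that score above the threshold against
    any wake word, i.e. would 'accidentally' wake the device."""
    hits = []
    for phrase in candidates:
        for wake in WAKE_WORDS:
            score = similarity(phrase.lower(), wake)
            if score >= THRESHOLD:
                hits.append((phrase, wake, score))
    return hits

if __name__ == "__main__":
    # A few words the article mentions as accidental triggers, plus controls.
    test_phrases = ["Montana", "unacceptable", "hey, seriously", "banana"]
    for phrase, wake, score in false_accepts(test_phrases):
        print(f"'{phrase}' scores {score:.2f} against wake word '{wake}'")
```

In this toy version, "Montana" scores above the threshold against "cortana" while the control words do not. Spelling similarity is a poor proxy for how words actually sound, though, which is why an example like "unacceptable" waking Alexa won't show up here and why the researchers ran their automated search on real audio rather than text.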

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist. Check out my YouTube channel: Serial Hobbyism