
Ok cool, Montana, election: Alexa, Siri, Google, Cortana can be triggered with these 1,000 words

Amazon’s Alexa, Apple’s Siri, the Google Assistant and Microsoft’s Cortana can be triggered accidentally by everyday words, and researchers claim to have identified more than 1,000 such words for each assistant.

By: HT TECH
| Updated on: Jul 03 2020, 17:09 IST
This discovery adds more weight to the security concerns users have had with these smart speakers. These speakers are not supposed to be listening to any conversation unless they are triggered with the wake word. (Pixabay)

Smart speakers are supposed to be activated only by their wake words, such as ‘Hey Siri’, ‘Ok Google’ or ‘Alexa’. However, users will often have noticed that these speakers sometimes get triggered even when no wake word has been spoken.

Amazon’s Alexa, Apple’s Siri, the Google Assistant and Microsoft’s Cortana can all be triggered accidentally by ordinary words, and researchers at Ruhr-Universität Bochum and the Max Planck Institute for Security and Privacy in Germany claim to have identified over 1,000 words that do this for each assistant.

This discovery adds more weight to the security concerns users have had with these smart speakers. These speakers are not supposed to be listening to any conversation unless they are triggered with the wake word. But the fact that they can be ‘accidentally’ triggered means that these smart speakers are listening to conversations even when they are not supposed to. This raises concerns that smart speakers could be listening in on private or sensitive conversations without our knowledge.

However, the researchers believe this sensitivity to similar-sounding words is a deliberate engineering trade-off: the assistants are tuned to respond readily so they do not miss genuine wake words, at the cost of occasional false triggers.

To track these words, the researchers placed the smart speakers in a room and played them audio from TV shows such as Modern Family, House of Cards and Game of Thrones. While the shows played, the researchers noted every time the light on a speaker came on, indicating that it had been triggered and was listening.

Researchers then replayed the sequences that triggered the speakers to figure out which word specifically called them to action.

For Alexa, the researchers found that words such as ‘election’, ‘a letter’ and ‘unacceptable’ woke the speaker. In the case of Apple’s Siri, it was phrases like ‘Hey Jerry’ and ‘a city’, which phonetically sound quite close to ‘Hey Siri’ or ‘Siri’.

The Google Assistant was triggered by phrases like ‘Ok cool’ and ‘Ok, who’s reading’, and Cortana also responded to being called ‘Montana’.

If one thinks about it, some of these words and phrases sound similar to the wake words that call these speakers to action. But it is still concerning, because an accidentally triggered speaker can record conversations we are not aware of and may not want recorded.

Security concerns regarding smart speakers were heightened when it was revealed that all of these companies used third-party human contractors, in addition to their AI models, to go through audio recordings collected from the speakers. The companies said this was for ‘quality monitoring purposes’. However, they have all since claimed that they no longer use humans to vet these recordings.


First Published Date: 03 Jul, 17:09 IST