Researchers compile a list of over 1,000 WORDS that inadvertently activate Alexa, Siri, and Google Assistant - including 'tobacco,' 'Montana,' 'election' and 'a city'

Researchers in Germany have compiled a list of more than 1,000 words that inadvertently activate virtual assistants like Amazon's Alexa and Apple's Siri.
Once activated, these virtual assistants create sound recordings that are later transmitted to platform holders, where they may be transcribed for quality assurance purposes or other analysis.
According to the team, from Ruhr-Universität Bochum and the Max Planck Institute for Security and Privacy in Germany, this has 'alarming' implications for user privacy and likely means short recordings of personal conversations could periodically end up in the hands of Amazon, Apple, Google, or Microsoft workers.
Researchers in Germany tested virtual assistants like Amazon's Alexa, Apple's Siri, Google Assistant, and Microsoft's Cortana, and found more than 1,000 words or phrases that would inadvertently activate each device
The group tested Amazon's Alexa, Apple's Siri, Google Assistant, Microsoft Cortana, as well as three virtual assistants exclusive to the Chinese market, from Xiaomi, Baidu, and Tencent, according to a report from the Ruhr-Universität Bochum news blog.
They left each virtual assistant alone in a room with a television that played dozens of hours of episodes from Game of Thrones, Modern Family, and House of Cards, with English, German, and Chinese audio tracks for each.
When a device activates, an LED on it lights up, and the team cross-referenced each observed activation against the dialogue being spoken at that moment.
In all, they cataloged more than 1,000 words and phrases that produced inadvertent activations.
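To make that cross-referencing step concrete, here is a minimal sketch of the idea, assuming a timed transcript and a list of observed LED activation times; all timestamps, dialogue lines, and the matching tolerance below are hypothetical stand-ins, not the researchers' actual data or tooling.

# Hypothetical sketch: match observed LED activation times against a timed
# transcript to recover the dialogue that likely triggered each activation.

# (timestamp in seconds, dialogue) pairs from a subtitle file -- hypothetical
transcript = [
    (12.0, "He rode west toward Montana."),
    (47.5, "They banned tobacco in the castle."),
    (93.2, "There is a city beyond the wall."),
]

# Seconds at which an LED was observed turning on -- hypothetical
led_activations = [12.4, 93.6]

TOLERANCE = 2.0  # allowed gap between dialogue onset and LED response

def triggering_lines(activations, transcript, tolerance=TOLERANCE):
    """Return the transcript line nearest each activation, within tolerance."""
    hits = []
    for t in activations:
        # Find the dialogue line whose timestamp is closest to the activation.
        ts, line = min(transcript, key=lambda pair: abs(pair[0] - t))
        if abs(ts - t) <= tolerance:
            hits.append((t, line))
    return hits

for t, line in triggering_lines(led_activations, transcript):
    print(f"LED at {t:.1f}s -> candidate trigger: {line!r}")

Run over dozens of hours of audio, the same matching step is how a catalog of more than 1,000 trigger phrases could accumulate.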
According to researcher Dorothea Kolossa, the virtual assistants' designers likely made them sensitive on purpose, to avoid frustrating users.
'The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans,' researcher Dorothea Kolossa said.
'Therefore, they are more likely to start up once too often rather than not at all.'
Apple's Siri is supposed to be activated by saying 'Hey Siri,' but the team found it would also regularly be turned on by 'a city' and 'Hey Jerry'
The team left each device in a room with episodes of television shows like Game of Thrones, House of Cards, and Family Guy running for dozens of hours to test which words or phrases produced inadvertent activations
Once activated, a device uses local speech analysis software to determine whether the sound was intended as an activation word or phrase.
If it judges that likely, it sends an audio recording several seconds long to cloud servers for additional analysis.
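A minimal sketch of that two-stage flow is below, assuming a generic on-device detector followed by a cloud verification call; the model, threshold, clip length, and upload function are hypothetical stand-ins, not any vendor's actual implementation.

import collections

LOCAL_THRESHOLD = 0.3   # deliberately low, matching the 'forgiving' tuning above
CLIP_SECONDS = 5.0      # roughly the 'several seconds' sent to the cloud
FRAME_SECONDS = 0.5     # duration of one audio frame -- hypothetical

def local_wake_score(frame):
    """Hypothetical on-device model: probability the frame holds the wake word."""
    return 0.0  # stand-in for a small local neural detector

def upload_for_verification(clip):
    """Hypothetical cloud call: a larger model re-checks the candidate clip."""
    return False  # stand-in; this upload is the privacy-relevant step

def run_pipeline(frames):
    # Rolling buffer so a few seconds of audio around the trigger are available.
    buffer = collections.deque(maxlen=int(CLIP_SECONDS / FRAME_SECONDS))
    for frame in frames:
        buffer.append(frame)
        if local_wake_score(frame) >= LOCAL_THRESHOLD:
            clip = b"".join(buffer)
            # The recording leaves the device here, whether or not the
            # cloud model later decides the activation was a mistake.
            if upload_for_verification(clip):
                print("activation confirmed; listening for a command")
            else:
                print("false activation, but the clip was already uploaded")

The key design point is that the upload happens before the more accurate second-stage check, which is why misfires on words like 'Montana' or 'a city' can still put snippets of conversation on a vendor's servers.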
'From a privacy point of view, this is of course alarming, because sometimes very private conversations can end up with strangers,' says Thorsten Holz.
'From an engineering point of view, however, this approach is quite understandable, because the systems can only be improved using such data.'
'The manufacturers have to strike a balance between data protection and technical optimization.' 
In May, a former Apple contractor said the company was capturing small portions of private conversations through the Siri interface, including discussions of medical information, criminal activity, business meetings, and even sex.
The whistleblower, Thomas le Bonniec, had worked in Apple's Cork, Ireland, office, listening to countless short audio recordings, until he resigned in 2019.
'They do operate on a moral and legal grey area, and they have been doing this for years on a massive scale,' he told The Guardian. 'They should be called out in every possible way.' 