[Image: Amazon Echo with Alexa. Source: T3 Magazine / Contributor]

Alexa and Siri Can Hear Things You Can't. That Could Be A Big Problem

By Mark LoProto


How much do you trust your in-home artificial intelligence? You casually ask it to play music, turn on your lights, or get directions to a new restaurant, building a rapport that you may mistake for a budding friendship. To a degree, you forget that it's an easily manipulated machine, not some infallible eButler. A study performed by a group of students from the University of California, Berkeley and Georgetown University may unfortunately start to erode that artificial relationship.


By hiding commands inside white noise played over speakers, the students found they could get artificial assistants to respond to instructions no human listener would notice. So far the commands have been mostly harmless, such as turning on airplane mode or opening a website, but if the capability exists, it can just as easily be used maliciously in the wrong hands.

The 2016 study culminated in a research paper revealing that commands could be inserted directly into music or spoken text without the listener knowing. The fear is that Alexa, Siri, and Google Assistant could be manipulated into ordering items online, unlocking doors, wiring money, and even turning off alarms.
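For readers curious about the mechanics, attacks in this line of research generally work by adding a tiny, carefully optimized perturbation to ordinary audio so that a speech-recognition model "hears" a command a human cannot. The sketch below is illustrative only and is not the researchers' actual code: TinySpeechModel is a hypothetical stand-in for a real speech-to-text network, and the target encoding, learning rate, and clipping threshold are placeholder assumptions.

```python
# Minimal sketch of a targeted adversarial-audio attack (illustrative only).
# Idea: optimize a small perturbation "delta" so the model transcribes an
# attacker-chosen phrase, while keeping delta quiet enough to go unnoticed.
import torch
import torch.nn as nn

class TinySpeechModel(nn.Module):
    """Hypothetical stand-in for a real speech-to-text network."""
    def __init__(self, n_chars=28):
        super().__init__()
        self.conv = nn.Conv1d(1, 32, kernel_size=11, stride=4)
        self.rnn = nn.GRU(32, 64, batch_first=True)
        self.head = nn.Linear(64, n_chars)          # per-frame character logits

    def forward(self, waveform):                    # waveform: (batch, samples)
        h = self.conv(waveform.unsqueeze(1))        # (batch, 32, frames)
        h, _ = self.rnn(h.transpose(1, 2))          # (batch, frames, 64)
        return self.head(h).log_softmax(-1)         # CTC expects log-probs

model = TinySpeechModel().eval()
for p in model.parameters():                        # attack the input, not the model
    p.requires_grad_(False)

audio = torch.randn(1, 16000)                       # 1 second of "benign" audio
target = torch.tensor([[3, 15, 5, 14]])             # placeholder-encoded command

delta = torch.zeros_like(audio, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-3)
ctc = nn.CTCLoss()

for step in range(200):
    log_probs = model(audio + delta)                # (batch, frames, chars)
    frames = log_probs.size(1)
    loss = ctc(log_probs.transpose(0, 1),           # CTC wants (frames, batch, chars)
               target,
               torch.tensor([frames]),
               torch.tensor([target.size(1)]))
    loss = loss + 10.0 * delta.abs().max()          # penalize loud perturbations
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-0.01, 0.01)                   # hard-clip to stay near-inaudible

adversarial_audio = (audio + delta).detach()        # sounds like the original to a person
```

The key design idea is the two-part objective: one term pushes the model toward the attacker's transcription, while the other keeps the added noise quiet enough to pass as static or blend into music.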