How Burger King Revealed the Hackability of Voice Assistants

Burger King pulled a pretty juicy marketing stunt last month that drew plenty of attention -- not just to the Whopper, but also to the intrinsic vulnerabilities of a new type of voice-activated gadget.

The fast food chain's 15-second television ad targeted Google Home, a speaker that can answer questions and control other smart appliances. When an actor in the ad said "OK, Google" and asked a question about the Whopper, Google Home obediently began reading the burger's ingredients in homes around the country -- effectively extending the commercial for however long it took someone to shout "OK, Google, stop!"

Google and Wikipedia quickly made fixes to shut it down. Though annoying, the stunt may have done some good by highlighting how easy it is to hijack such devices. (Just imagine a burglar spying a voice assistant and asking it to unlock all the doors.) It could also speed the development of home voice assistants with better security.

"It's a wakeup call," said Earl Perkins, a digital security analyst at the research firm Gartner. "It's a harbinger of things to come."

Trigger Warning

Voice assistants such as Google Home, Apple's Siri and Amazon's Alexa have always been susceptible to accidental activation. A Google ad during the Super Bowl that used the phrase "OK, Google" reportedly set off people's devices. And in a January story that briefly turned a family into media celebrities, a woman's 6-year-old daughter ordered a dollhouse and sugar cookies simply by asking Amazon's voice assistant Alexa for them.

Since the devices are so new -- the Amazon Echo debuted in 2015, Google Home last year -- they're still working through growing pains. And their popularity is surging; Consumer Intelligence Research Partners estimates that Amazon sold 3 million Echo devices in the U.S. in the fourth quarter of 2016, bringing the total to...
