Who Hacked Barbie? Creepy Talking Toy Triggers Security Warnings
Toymaker Mattel is under fire for manufacturing a "Hello Barbie" doll that can "listen" to what kids are saying, transmit it to a Siri-like cloud service for voice processing, and then talk back.
Leaving aside the social history of Barbie - summary: German gag-gift doll for men, reinvented in 1959 as a wholesome American icon - the toy is triggering warnings from information security experts, who caution that anything that connects to the Internet can, and will, be hacked.
Consumer rights experts are also up in arms over Hello Barbie, with nonprofit group Campaign for a Commercial-Free Childhood labeling the $75 toy "creepy." They've called on Mattel to cancel the doll's scheduled fall 2015 debut.
"If I had a young child, I would be very concerned that my child's intimate conversations with her doll were being recorded and analyzed," privacy law expert Angela Campbell, a professor at Georgetown University, says in a statement. "In Mattel's demo, Barbie asks many questions that would elicit a great deal of information about a child, her interests and her family. This information could be of great value to advertisers and be used to market unfairly to children."
Internet of (Dangerous) Things
Such warnings have accompanied nearly every Internet of Things device, demonstrating how too many manufacturers prioritize speed to market over locking down the security and privacy of the data their products handle. Indeed, everything from Internet-connected cars and smart TVs to routers and talking dolls can be hacked - and in some cases already has been.
"Hello Barbie" Does a Siri
Mattel announces the world's first interactive Barbie doll, which "learns" about kids the more they speak to her.
Cayla Doll Hacked
In fact, the concerns about how Barbie might be hacked follow recent warnings about the Cayla doll, which pairs with a smartphone and relies on a dedicated app to process what's said to the doll and generate responses. Ken Munro, a partner at U.K.-based penetration testing firm Pen Test Partners, has demonstrated how easily a local attacker can hack the doll and alter her vocabulary, thanks in part to the doll using no Bluetooth authentication when pairing with a smartphone or tablet that runs her app. "Makes her completely promiscuous, so anyone that is in Bluetooth range can connect to her," Munro says in a blog post.
Cue the potential for an "evil brother" Bluetooth pairing, which would allow the attacker to interface the doll with their customized API - and vocabulary - or else simply use the doll's Bluetooth capabilities. "Whether we use the app to talk to the child, or simply use Cayla as a Bluetooth headset - mike and speaker - we still achieve the desired effect - control over what the doll says and hears," Munro says. "Scary."
Munro hasn't tested Hello Barbie yet, but says it's likely susceptible to similar attacks. "I don't care about whether she sells; I care about whether she's secure enough for our kids to play with, and whether the kid in me can make her swear," he says.
In the battle of Barbie versus Cayla, however, Munro notes that at least Barbie doesn't seem to be perpetually eavesdropping on children. "Fashion-conscious Barbie also appears to be security conscious - her belt buckle is a push-to-talk device - she only listens when you want her to," he says. In addition, he says that having the doll transmit its audio to a server - as opposed to storing it in a local database on the device - is potentially "a good thing," since the data could be encrypted in transit to the server and back.
Should Toys Talk Back?
But Hello Barbie's upcoming debut raises the question of whether such toys are suitable for children. "Kids don't/shouldn't need talking apps/toys," says Sean Sullivan, security advisor at Helsinki, Finland-based anti-virus vendor F-Secure. But he says Barbie and Cayla aren't the only culprits here, and calls out such apps as Android's My Talking Tom, which is billed as software that will repeat back everything a child says.
Are we ready for a dystopia in which hackers potentially subvert our children's toys and how they talk back? Culturally speaking, we've been down this path before, and as dolls such as Chucky and Annabelle have demonstrated, it's filled with nightmares.