Some child advocates are clamoring for Mattel to halt production of a new "smart" Barbie that can have conversations with your kid. The Barbie is powered by the kind of voice-recognition artificial intelligence that's becoming increasingly prevalent in our everyday gadgets, but that doesn't mean it doesn't freak people out. Especially when kids are involved.
At last month's Toy Fair, one of the standout new playthings was Hello Barbie, a new version of the classic doll with "brains" that let her listen to a statement or question from a kid and then respond appropriately. Over time, Hello Barbie learns what you like, and tailors conversation to those tastes and preferences. The demos on the show floor were really impressive!
The tech behind Barbie's brains, developed by a company called ToyTalk, works basically just like Apple's Siri voice assistant, except that it's designed to listen to the way kids talk instead of the way adults talk. Though AI of this kind has been around for years, the fact that it's in a toy has both privacy and children's activists all up in a huff.
Campaign for a Commercial-Free Childhood is a US pressure group leading the charge. The group claims that kids shouldn't be playing with toys powered by algorithms because they're no replacement for real interaction with humans. I'm not too sympathetic, as it seems like that's the kind of decision that parents should make, not a toymaker.
But granted, it's kind of creepy to think that a technically lifeless doll is having conversations with your kids, and that everything your kid says is being processed and analyzed. What happens to that highly personal data down the road?
The advocacy group pulled in a privacy heavy-hitter to help back up its claims.
Georgetown University Law Professor Angela Campbell, Faculty Advisor to the school's Center on Privacy and Technology, said, "If I had a young child, I would be very concerned that my child's intimate conversations with her doll were being recorded and analyzed. In Mattel's demo, Barbie asks many questions that would elicit a great deal of information about a child, her interests, and her family. This information could be of great value to advertisers and be used to market unfairly to children."
Data is an advertising goldmine, but a Mattel spokeswoman told me that the processed voice-recognition data won't be used to market or sell anything. Not movies, not clothes, not anything. Hello Barbie will know what day of the week it is so she'll be able to talk about, say, Valentine's Day, but she won't talk about an upcoming corporate event. "The data collected will only be used by Mattel to improve this product for girls," the spokeswoman told me.
In a statement, ToyTalk, the company behind the technology, concurred: "ToyTalk and Mattel will only use the conversations recorded through Hello Barbie to operate and improve our products, to develop better speech recognition for children, and to improve the natural language processing of children's speech."
Like Siri, the Hello Barbie tech is powered by cloud-based voice recognition and artificial intelligence. When you say something to Barbie, the Wi-Fi-connected toy sends the words up to ToyTalk's servers, where they're interpreted. The servers then crunch the query and determine which one of Hello Barbie's preloaded scripts should play back from a speaker in the toy.
But there is another important distinction between ToyTalk tech and the tech found in Siri, Microsoft's Cortana, and Amazon's Echo: ToyTalk never queries the open web. It always sticks to a script.
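To make that distinction concrete, here's a minimal sketch of what a scripted-response pipeline like the one described might look like. Everything here is hypothetical: the keyword table, the script library, and the `pick_response` function are illustrations, not ToyTalk's actual system, and the real product does its speech recognition in the cloud rather than from a text transcript.

```python
# Hypothetical sketch of a scripted voice-response pipeline.
# The toy uploads audio; the server transcribes it and then matches
# the transcript against preloaded scripts. The key property shown
# here: the reply is ALWAYS drawn from a fixed, pre-written set --
# nothing is fetched from the open web.

# Pre-written replies (stand-ins for Hello Barbie's real scripts).
PRELOADED_SCRIPTS = {
    "fashion": "I love picking out outfits! What's your favorite color?",
    "school": "School can be fun! What's your favorite subject?",
    "family": "Families are the best. Tell me about yours!",
}

# Safe fallback when no topic is recognized.
FALLBACK = "That's interesting! Tell me more."

# Simple keyword-to-topic table (a real system would use an NLP model).
KEYWORDS = {
    "dress": "fashion", "outfit": "fashion", "clothes": "fashion",
    "homework": "school", "teacher": "school", "class": "school",
    "mom": "family", "dad": "family", "sister": "family",
}

def pick_response(transcript: str) -> str:
    """Map a transcribed utterance to one of the preloaded scripts."""
    for word in transcript.lower().split():
        topic = KEYWORDS.get(word.strip(".,!?"))
        if topic:
            return PRELOADED_SCRIPTS[topic]
    return FALLBACK
```

So a child saying "I did my homework today" would hear the school script, while anything unrecognized falls back to a generic prompt, never to a web search result.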
In other words, the use of voice recognition technology in Hello Barbie is pretty innocuous. Still, the dust-up is emblematic of how people are going to be wary as uncanny voice recognition tech works its way into more and more parts of our lives. Sure, in reality, we've been at the mercy of huge data collection systems and learning algorithms for some time. (How do you think Google works?) But as these experiences become increasingly intimate, and indeed, increasingly human, people are going to get freaked out. And that's probably a good thing. Better to take baby steps into the strange future.