
Connected Toys Are Raising Complicated New Privacy Questions

Toys and other devices are collecting loads of data from children. What could go wrong?
July 22, 2016

Talking toys have come a long way since the original Furby. Now they’re connected to the Internet, use speech recognition, and are raising a host of new questions about the online privacy and security of children.

Hackers have already targeted toys. Late last year, Hong Kong-based digital toy maker VTech admitted that cybercriminals had accessed the personal information of 6.4 million children. Researchers have also shown how hackers can gain control of connected dolls. But a number of the privacy-related challenges raised by connected toys are novel. These toys are collecting new kinds of data, and what’s at stake if something goes wrong is not always clear.

Two of the most prominent examples of the new generation of toys are the Dino, a cloud-connected, Wi-Fi-enabled plastic dinosaur that uses speech recognition technology and IBM’s Watson to “listen” and respond to a child’s words, and Mattel’s Hello Barbie, which also uses speech recognition and uploads voice recordings to the cloud. Both work much the way virtual assistants like Apple’s Siri and Amazon’s Alexa do.
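In broad strokes, the data flow looks something like the sketch below. It is not either company’s actual implementation; the endpoint, field names, and the record_audio and play_audio helpers are hypothetical stand-ins, included only to show that raw voice recordings and a device identifier leave the toy for the cloud.

```python
import requests  # HTTP client; any equivalent library would do

# Hypothetical cloud endpoint -- a stand-in for the vendor's real service.
CLOUD_ENDPOINT = "https://toy-cloud.example.com/converse"

def handle_utterance(record_audio, play_audio, device_id):
    """Rough flow of a cloud-connected talking toy (illustrative only)."""
    clip = record_audio(seconds=5)            # capture the child's words on the device
    resp = requests.post(
        CLOUD_ENDPOINT,
        files={"audio": ("clip.wav", clip)},  # the raw voice recording leaves the device...
        data={"device_id": device_id},        # ...along with an identifier
        timeout=10,
    )
    reply = resp.json()
    # The cloud side does the heavy lifting: speech-to-text, a dialogue
    # engine (IBM's Watson, in the Dino's case), and a generated reply.
    play_audio(reply["speech_audio_url"])     # hypothetical response field
```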

They’re regulated by a law that now looks outdated. In 1998, the U.S. enacted the Children’s Online Privacy Protection Act, or COPPA, to protect young children from the risks of sharing personal information online. The Federal Trade Commission enforces COPPA, which is designed to give parents control over their children’s data. But some privacy advocates and policymakers argue that COPPA is not clear enough in some cases, given the wide and growing range of technological capabilities of today’s toys.

The Dino, made by a startup called Elemental Path, records what children say and uses IBM’s Watson to respond.

Technically, COPPA applies to online services “directed to children under 13 that collect, use, or disclose personal information from children.” That clearly includes toys like the Dino and Hello Barbie. But should it also apply to other applications like Siri and Alexa, or other data-collecting online services that are popular with children even though they are not exclusively directed at them?

Earlier this month, Virginia Senator Mark Warner sent a letter to the FTC expressing concern over the increasing collection of children’s personal information by apps and toys and asking for clarification about how the FTC will enforce COPPA in this environment. “While Congress may have had an inkling of the future growth of web-services in 1998,” Warner wrote, “it certainly did not envision the array of conventional household products that now possess data gathering and processing capabilities.” He asked the commission to clarify how it determines whether a device, website, or app is directed at children. The FTC has not yet responded.

Another set of big questions hovers around consent. Under COPPA, companies must get “verified consent” from a child’s parent before collecting personal information from that child. But getting that consent for connected toys can be much trickier than it was when the Internet just meant browsers and websites.

One of the biggest challenges is that new connected toys and devices often feature small, limited, or disconnected user interfaces, if they have them at all, former FTC commissioner Julie Brill said at a discussion this week in Washington, D.C., focused on children’s online privacy. Brill, now a partner at a D.C. law firm, served as an FTC commissioner from 2010 until last March.

The Dino and Hello Barbie come with applications that parents can use to adjust settings and provide consent to data collection. But what happens when another child comes over, or the child takes the toy to school? Technically it can’t record other children until their own parents give consent as well, said Brill. This will be an “interesting challenge,” and companies will have to come up with creative new ways to get this consent, she said. For example, future products may be able to use voice recognition—not just speech recognition—to identify specific people and refrain from recording those who have not consented.
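A consent gate of that kind might look roughly like the following sketch. It is purely illustrative, assuming a hypothetical on-device speaker-identification step and consent registry; no shipping toy is known to work this way.

```python
# Illustrative consent gate: process audio only from speakers whose parents
# have given verified consent; discard everything else on the device.

CONSENTED_VOICEPRINTS = {"voiceprint-child-a"}  # hypothetical registry built during parental setup

def identify_speaker(audio_clip):
    # Placeholder for on-device voice recognition -- "who is talking",
    # as opposed to speech recognition, "what was said".
    return "voiceprint-unknown"

def maybe_upload(audio_clip, upload):
    speaker = identify_speaker(audio_clip)
    if speaker in CONSENTED_VOICEPRINTS:
        upload(audio_clip)   # verified parental consent on file: OK to send to the cloud
    else:
        return None          # no consent: the recording never leaves the toy
```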

Finally, thorny ethical dilemmas could arise, especially if young children choose to share very sensitive things. What exactly should a company do when it records a four-year-old saying she’s been abused? Notify the police? What if it’s not true? Questions like these are challenging, but as connected toys and other listening devices become more popular we’ll probably need to answer them.
