Back in 2015, privacy campaigners became aware of a new Wi-Fi-connected toy that raised considerable concern. At issue was Hello Barbie, a doll with speech recognition technology that could hold a two-way conversation with a child.
Parents and others grew concerned when it became clear that children’s conversations would be stored on cloud servers and used in various ways by Mattel, the toy maker. At the time, Forbes reported that the toy’s terms and conditions allowed the sharing of “audio recordings with third party vendors who assist us with speech recognition.”
This system had the potential to reveal and share a child’s innermost thoughts. And it raised a wide range of ethical questions. For example, what is the appropriate response if a child asks “What should I be when I grow up?”
The episode is emblematic of a much bigger question: How should children’s interests be represented in the debate about privacy and big data?
Today, Gabrielle Berman and Kerry Albright from the UNICEF Office of Research in Florence, Italy, argue that children’s rights have been underrepresented in this area. “Due to the potential for severe, long-lasting and differential impacts on children, child rights need to be firmly integrated onto the agendas of global debates about ethics and data science,” they say.
Privacy issues are always complex, but they have greater relevance for children than ever before. Data is being collected and processed on a previously unimaginable scale that is growing at a fantastic rate. “This accumulation implies that more data will be collected on children over their lifetime than ever before,” say Berman and Albright.
Clearly, there will be benefits. Health experts hope to use this data to personalize and improve medicine, for example. Others hope to deliver better services tailored more precisely to each person’s needs. The next generation has the most to gain from these benefits.
But there are disadvantages, too. One problem is the persistence of data—the information gathered about children and teenagers could be tied to them by third parties throughout their lives.
This is being addressed by the “right to be forgotten,” which allows people in Europe to have historical information about them deleted in certain circumstances. Indeed, there are special provisions within European legislation about how this applies to information about children.
Another concern is the spread of data beyond the parties who have collected it. Though anonymization techniques often prevent the data from being linked to specific individuals, there are various ways in which data can later be de-anonymized.
Then there are the unknown consequences of future data processing techniques. Nobody is sure how the data that is gathered today will be used in the future.
For example, social services in countries such as New Zealand and the U.S. already use data gathered about families to identify children who are at risk. Certain educational establishments use data gathered about students to predict how well they will do and to make decisions about their future.
It is not at all clear that these applications were apparent when the data was gathered. An important question is whether the actions taken as a result of this data processing are themselves creating undesired outcomes.
Finally, there is the issue of informed consent. In Europe, parents of children under 13 must give consent in order for data to be collected. But there is less protection for older children. An important issue is how to present children with the information they need to decide whether to accept terms and conditions, and how this should change as they get older. This is particularly tricky when future uses of the data are unknown.
Berman and Albright say there needs to be a significant effort to better represent the interests of children in this debate, particularly when children in some parts of the world are significantly less well protected than others. “In an era of increasing dependence on data science and big data, the voices of one set of major stakeholders—the world’s children and those who advocate on their behalf—have been largely absent,” they say.
That’s troubling, and a good reason to refocus efforts now. As Berman and Albright conclude: “There is no better time to encourage greater debate and dialogue between the child rights and data science communities for the betterment of the lives of children worldwide, than now.”
Ref: arxiv.org/abs/1710.06881 : Children and the Data Cycle: Rights and Ethics in a Big Data World