The news: Social-media companies should be forced to share data about their users and how they use their products as part of research aimed at reducing rates of suicide and self-harm among young people, according to a new report from the Royal College of Psychiatrists.
Hand it over: The college, which represents psychiatrists working in the UK, has called on the government to compel Twitter, Facebook, and Instagram to hand over data to academics on the type of material users are viewing, and the amount of time they are spending on the platforms. Although most children and young people will be able to benefit from technology without negative effects, some may be vulnerable to compulsive use and potential harms, the report says. The college said any data shared would be anonymized (although this is trickier to guarantee than it sounds).
The context: The UK government is in the process of setting up an online safety regulator, and the college says this body should be given the power to force companies to hand over data. It suggested this research could be paid for if a forthcoming 2% “digital services tax” on tech companies in the UK were extended to include international transactions.
Is this a good idea? Undoubtedly, it would be useful if researchers were able to get a better idea of what link, if any, there may be between scrolling through harmful social-media posts and self-harm or even suicide among young people. However, tech companies are unlikely to share this data. They have little incentive to, and could expose themselves to risks if they do: individual users could potentially be reidentified from the data, for example, or people might choose to take legal action as a result of the findings.
Almost two years ago, a pair of Apple investors wrote an open letter to the company calling for it to do more to protect children from the supposedly damaging effects of digital technology. Not much changed as a result. And it’s worth remembering that for all the panic about screen time and children, there’s still remarkably little evidence of a causal link with poor mental health. In other words, kids who are more depressed and anxious could be choosing to spend more time on their smartphones, rather than the phones causing them to be more anxious and depressed.
The news: The European Commission is considering a ban on facial recognition in public places for up to five years, with exceptions for research and security projects, according to a draft white paper.
The background: Activists on both sides of the Atlantic have been concerned about facial recognition, saying that the technology isn’t accurate for women and people of color and can be used to spy on people without their consent. The European data protection supervisor has written that turning a human face into an object for powerful companies and governments to measure may infringe on human dignity. A UK survey found that 46% of the public thought they should be able to opt out of facial recognition.
In the US, cities such as San Francisco and Somerville, Massachusetts, have banned government use of facial recognition. Activists are working to ban private use of the technology as well, though, interestingly, a Pew Research poll found that most Americans are more comfortable with police using the technology than with companies doing so.
Is a temporary ban a good idea? Yes, especially given the breakneck pace at which the technology is being deployed in Europe, by everyone from police forces to supermarkets. Recently, both France and Sweden stopped schools from installing facial recognition on their grounds. And pausing to assess the technology's impact now is safer than trying to undo the damage once it has been deployed. The European Commission's suggestion (which might change when the final paper is released in February) is also stronger than the positions of many candidates in the 2020 US presidential election, most of whom have called for task forces to evaluate facial recognition in policing rather than advocating an outright moratorium in public spaces.