YouTube won’t disable its recommendation algorithm on videos of children, even though the system makes sexualized footage of minors easier to find.
The news: YouTube’s automated recommendation system has assembled a collection of videos of prepubescent, partially clothed children and is recommending them to people who have watched similar videos, the New York Times reports. While recommendations have been switched off on certain videos, the company has refused to end the practice altogether.
One example: A woman in Brazil uploaded a seemingly innocuous video of her 10-year-old daughter playing with a friend in a backyard pool. A few days later, the video had 400,000 views, largely thanks to YouTube’s automated recommendations.
YouTube’s response: YouTube disabled comments on many videos of children in February after an outcry over pedophiles using the comment sections to guide each other, and it doesn’t let kids under 13 open accounts. However, it won’t stop recommending videos of children because it is worried about the negative impact on family vloggers, some of whom have many millions of followers. In a blog post responding to the New York Times story, YouTube said it was “limiting” recommendations on some videos that may put children at risk.
The crux: YouTube’s overriding aim is to keep your eyeballs on it for as long as possible, and its systems reflect that priority. Conspiracy theory videos and far-right content still flourish on its platform, boosted by YouTube’s algorithms. In this case, on balance, the company has decided the risk of upsetting content creators by tweaking its systems outweighs the benefit of not aiding pedophiles.