
YouTube’s algorithm makes it easy for pedophiles to find more videos of children

YouTube won’t disable its recommendation algorithm on videos of children, even though the system makes it easier to find sexualized footage of minors.

The news: YouTube’s automated recommendation system has been gathering videos of prepubescent, partially clothed children and recommending them to people who have watched similar content, the New York Times reports. While YouTube has switched off recommendations on certain videos, the company has refused to end the practice altogether.


One example: A woman in Brazil uploaded a seemingly innocuous video of her 10-year-old daughter playing with a friend in a backyard pool. A few days later, the video had 400,000 views, largely thanks to YouTube’s automated recommendations.


YouTube’s response: YouTube disabled comments on many videos of children in February after an outcry over pedophiles using the comment section to guide each other, and it doesn’t let kids under 13 open accounts. However, it won’t stop recommending videos of children, because it is worried about the negative impact on family vloggers, some of whom have many millions of followers. In a blog post responding to the New York Times story, YouTube said it was “limiting” recommendations on some videos that may put children at risk.

The crux: YouTube’s overriding aim is to keep your eyeballs on it for as long as possible, and its systems reflect that priority. Conspiracy theory videos and far-right content still flourish on its platform, boosted by YouTube’s algorithms. In this case, on balance, the company has decided the risk of upsetting content creators by tweaking its systems outweighs the benefit of not aiding pedophiles.  

Sign up here for our daily newsletter The Download to get your dose of the latest must-read news from the world of emerging tech.

