
YouTube has nearly halved the number of conspiracy theory videos it recommends


The news: YouTube has managed to drastically reduce the number of conspiracy theory videos it recommends, but the total is creeping back up again, according to a new study.

The study: Researchers trained an algorithm to judge the likelihood that a video on the site contained conspiracy theories by looking at the description, transcript, and comments. They examined eight million recommendations over 15 months. They found that shortly after YouTube announced it would recommend less conspiracy content in January 2019, the numbers did indeed gradually drop—by about 70% at the lowest point in May 2019. However, the number of conspiracy videos YouTube’s algorithm recommends has steadily risen again since then. These recommendations are now only 40% less common than when YouTube started its crackdown.
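The paper doesn't publish its classifier, but the general approach it describes—scoring a video's text for conspiracy likelihood—can be illustrated with a minimal sketch. The pipeline below is an assumption for illustration, not the researchers' actual model: it concatenates a video's description, transcript, and comments, then trains a simple TF-IDF plus logistic-regression classifier that outputs a probability-like score.

# Minimal sketch of a conspiracy-likelihood text classifier (illustrative only;
# not the researchers' actual model). Assumes a small set of labeled videos.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


def video_text(video: dict) -> str:
    """Concatenate the three text fields the study says it examined."""
    return " ".join([
        video.get("description", ""),
        video.get("transcript", ""),
        " ".join(video.get("comments", [])),
    ])


# Hypothetical labeled training data: 1 = conspiratorial, 0 = not.
train_videos = [
    {"description": "the truth they hide about the flat earth",
     "transcript": "wake up, the globe is a lie", "comments": ["so true"]},
    {"description": "how jet engines work",
     "transcript": "a turbofan compresses incoming air", "comments": ["great explainer"]},
]
train_labels = [1, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),  # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit([video_text(v) for v in train_videos], train_labels)

# Score a recommended video: conspiracy likelihood as a value in [0, 1].
new_video = {"description": "what NASA won't tell you",
             "transcript": "the moon landing was staged", "comments": ["exposed!"]}
score = model.predict_proba([video_text(new_video)])[0, 1]
print(f"conspiracy likelihood: {score:.2f}")

In practice a classifier like this would need far more labeled data and careful validation; the sketch only shows how the three text sources could feed a single likelihood score per recommended video.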

A reminder: YouTube’s recommendations are responsible for almost three-quarters of the more than one billion hours people spend watching videos on YouTube every day.

Uneven progress: Progress has varied by the type of video, the researchers found. YouTube has almost entirely scrubbed some conspiracy theories from its recommendations, including claims that the US government helped organize 9/11 or that the Earth is flat. Others continue to flourish, however, including videos espousing climate change denial. The researchers told the New York Times that these findings suggest YouTube has decided which kinds of misinformation it will and won't permit, though it has yet to disclose any such policy publicly. Doing so is legal: a US appeals court recently ruled that because YouTube is a private forum, free speech protections don't apply to it.

The algorithm: We don’t fully understand how YouTube’s recommendation algorithms work, and the company regularly tweaks them. YouTube says the recommendation engine’s goal is to “help viewers find the videos they want to watch, and to maximize long-term viewer engagement and satisfaction.” Although previous studies have established that YouTube has played a role in helping radicalize people, it’s proved hard to establish exactly how the site works. That’s partly because it’s virtually impossible to replicate an individual user’s experience there. This study experienced that limitation too: it was able to study the site only from the perspective of someone who’s logged out, which is not how most people use YouTube.

