
Hated that video? YouTube’s algorithm might push you another just like it.

New research from Mozilla shows that user controls have little effect on which videos YouTube’s influential AI recommends.

September 20, 2022
Illustration: a hand-shaped cursor pushes video frames to the right, leaving drag marks. Stephanie Arnett/MITTR | Envato

YouTube’s recommendation algorithm drives 70% of what people watch on the platform.

That algorithm shapes the information billions of people consume, and YouTube has controls that purport to let people adjust what it shows them. But a new study finds that those tools do little: users have scant power to keep unwanted videos—including compilations of car crashes, livestreams from war zones, and hate speech—out of their recommendations.

Mozilla researchers analyzed seven months of YouTube activity from over 20,000 participants to evaluate four ways that YouTube says people can “tune their recommendations”—hitting Dislike, Not interested, Remove from history, or Don’t recommend this channel. They wanted to see how effective these controls really are. 

Every participant installed a browser extension that added a Stop recommending button to the top of every YouTube video they saw, plus those in their sidebar. Hitting it triggered one of the four algorithm-tuning responses every time.

Dozens of research assistants then eyeballed those rejected videos to see how closely they resembled tens of thousands of subsequent recommendations from YouTube to the same users. They found that YouTube’s controls have a “negligible” effect on the recommendations participants received. Over the seven months, one rejected video spawned, on average, about 115 bad recommendations—videos that closely resembled the ones participants had already told YouTube they didn’t want to see.

Prior research indicates that YouTube’s practice of recommending videos you’ll likely agree with and rewarding controversial content can harden people’s views and lead them toward political radicalization. The platform has also repeatedly come under fire for promoting sexually explicit or suggestive videos of children—pushing content that violated its own policies to virality. Following scrutiny, YouTube has pledged to crack down on hate speech, better enforce its guidelines, and not use its recommendation algorithm to promote “borderline” content.

Yet the study found that content that seemed to violate YouTube’s own policies was still being actively recommended to users even after they’d sent negative feedback.

Hitting Dislike, the most visible way to provide negative feedback, stops only 12% of bad recommendations; Not interested stops just 11%. YouTube advertises both options as ways to tune its algorithm. 

Elena Hernandez, a YouTube spokesperson, says, “Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers.” Hernandez also says Mozilla’s report doesn’t take into account how YouTube’s algorithm actually works. But that is something no one outside of YouTube really knows, given the algorithm’s billions of inputs and the company’s limited transparency. Mozilla’s study tries to peer into that black box to better understand its outputs.

The tools that work best, the study found, don’t just express a sentiment but give YouTube an order. Remove from history reduced unwanted recommendations by 29%, and Don’t recommend this channel did the best, stopping 43% of bad recommendations. Even so, videos from a channel that viewers have asked YouTube to mute can still appear in their suggestions.

Mozilla’s report speculates that this is because the platform prioritizes watch time over user satisfaction, a metric YouTube’s recommendation algorithm didn’t even consider for the first 10 years of the platform’s history. If YouTube wants to “actually put people in the driver’s seat,” Mozilla says, the platform should allow people to proactively train the algorithm by excluding keywords and types of content from their recommended videos. 

Many of the issues Mozilla’s report raises center on recommendations of potentially traumatizing content. One participant received recommendations for videos demoing guns, even after asking YouTube to stop recommending a very similar video on firearms. And YouTube continued to recommend footage of active fighting in Ukraine to participants who rejected similar content.

Other recommendations were just obnoxious. A crypto get-rich-quick video and an “ASMR Bikini Try-On Haul” are examples of the types of videos users flagged but couldn’t drive out of their recommendations. One participant said, “It almost feels like the more negative feedback I provide to their suggestions, the higher bullshit mountain gets.” Christmas music is another category of recommended content that participants found difficult to escape.

“YouTube has its struggles, like all platforms, with this gap between the rule they have written and their enforcement,” says Mark Bergen, author of Like, Comment, Subscribe, a recent book on YouTube’s rise. “Part of that is just because they’re just dealing with such a huge volume of video, and so many different countries and languages.”

Still, Bergen says, YouTube’s AI is powerful enough to offer users tools to shape the content they see. “YouTube likes to say ‘The algorithm is the audience,’” Bergen says. But to him, it’s clear that average users are either not being heard or not being understood.

