YouTube says it’s going to clean up its act, but don’t expect much
The video site is full of questionable content—but an ex-staffer says the underlying algorithms really need attention.
The problem: Take your pick: conspiracy theories, violence, propaganda, and worse all lurk on the site.
A solution? The Wall Street Journal says YouTube will provide more context around videos, warning users when they contain conspiracy theories or are produced by state-funded media outlets. It may show credible videos alongside the problematic ones.
Deeper problems: Guillaume Chaslot, an AI expert who worked on YouTube’s recommendation system, tells the Guardian that its algorithms will keep surfacing questionable content:
“The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy. There are many ways YouTube can change its algorithms to suppress fake news … I tried to change YouTube from the inside but it didn’t work.”
More to be done: YouTube has applied many Band-Aids to content problems. It’s unlikely the new initiative will do much. Perhaps those algorithms need overhauling?
MIT Technology Review