The video site is full of questionable content—but an ex-staffer says the underlying algorithms really need attention.
The problem: Take your pick. Conspiracy theories, violence, propaganda, and worse lurk on the site.
A solution? The Wall Street Journal says YouTube will provide more context around videos, warning users when they contain conspiracy theories or are produced by state-funded media outlets. It may show credible videos alongside the problematic ones.
Deeper problems: Guillaume Chaslot, an AI expert who worked on YouTube’s recommendation system, tells the Guardian that its algorithms will keep surfacing questionable content:
“The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy. There are many ways YouTube can change its algorithms to suppress fake news … I tried to change YouTube from the inside but it didn’t work.”
More to be done: YouTube has applied plenty of Band-Aids to its content problems, and the new initiative looks unlikely to do much on its own. Perhaps those recommendation algorithms need overhauling?