The video site is full of questionable content—but an ex-staffer says the underlying algorithms really need attention.
The problem: Take your pick—conspiracy theories, violence, propaganda, and worse all lurk on the site.
A solution? The Wall Street Journal reports that YouTube will provide more context around videos, warning users when they contain conspiracy theories or are produced by state-funded media outlets. It may also show credible videos alongside the problematic ones.
Deeper problems: Guillaume Chaslot, an AI expert who worked on YouTube’s recommendation system, tells the Guardian that its algorithms will keep surfacing questionable content:
“The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy. There are many ways YouTube can change its algorithms to suppress fake news … I tried to change YouTube from the inside but it didn’t work.”
More to be done: YouTube has applied plenty of Band-Aids to its content problems, and this new initiative is unlikely to do much more. Perhaps those recommendation algorithms need a proper overhaul.