MIT Technology Review

YouTube says it’s going to clean up its act, but don’t expect much

The video site is full of questionable content—but an ex-staffer says the underlying algorithms really need attention.

The problem: Take your pick: conspiracy theories, violence, propaganda, and worse lurk on the site.


A solution? The Wall Street Journal says YouTube will provide more context around videos, warning users when they contain conspiracy theories or are produced by state-funded media outlets. It may show credible videos alongside the problematic ones.


Deeper problems: Guillaume Chaslot, an AI expert who worked on YouTube’s recommendation system, tells the Guardian that its algorithms will keep surfacing questionable content:

“The recommendation algorithm is not optimising for what is truthful, or balanced, or healthy for democracy. There are many ways YouTube can change its algorithms to suppress fake news … I tried to change YouTube from the inside but it didn’t work.”

More to be done: YouTube has applied many Band-Aids to content problems. It’s unlikely the new initiative will do much. Perhaps those algorithms need overhauling?

