
Can 10,000 Humans Clean Up YouTube?

The algorithms aren’t working. YouTube, like Facebook, consistently comes under fire for objectionable content, from extremism to child abuse. Tech leaders promise that artificial intelligence will help solve the problem by automatically identifying offensive content before anyone can see it. But YouTube’s best attempts so far spot only 75 percent of such clips before a user reports them.

Clearly, AI is not yet enough. YouTube’s CEO, Susan Wojcicki, appears to have taken that message on board. In a new blog post, she explains that the video site is swelling its moderation team to include 10,000 staff across Google to help battle objectionable content.


What will they do? To a large extent, more of the same. “Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content,” she writes. “We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether.”


But Wojcicki also points out that all the work has another use: training AIs. “Human judgment is critical to making contextualized decisions on content,” she writes, adding that by collecting data about how its moderators work, YouTube will be able to “train … machine-learning technology to identify similar videos in the future.”

If this all sounds familiar, well, that’s because Facebook added 3,000 extra content policers itself earlier this year. At the time, we argued that extra people alone, without robust and reliable AI, would be unlikely to make much of a dent in offensive content—because there’s so damn much of it to sift through.

In YouTube’s case, with 300 hours of footage uploaded every minute, it seems equally unlikely to succeed—at least until its algorithms have learned all they need from the meatspace moderators.
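The arithmetic is stark: 300 hours a minute works out to some 432,000 hours of new video every day. Even if all 10,000 staff did nothing but screen uploads, each would face more than 43 hours of fresh footage per day just to keep pace.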
