Why 3,000 (More) People Won’t Fix Facebook’s Violent Video Problem

The social network is using manpower to get footage of grisly acts off its site, but that may not be enough.

Facebook has a video problem. Certain clips recently posted by users have been horrifically violent, such as live videos posted last week in which a man in Thailand reportedly killed his infant daughter and then himself. And they aren’t always removed from the site quickly, either, which means many people may end up seeing them.

In an effort to get this content off Facebook faster, CEO Mark Zuckerberg said in a post Wednesday that the social network will make it easier for users to report inappropriate videos, and it is hiring more people to review such reports. The team, called community operations, will grow by 3,000, Zuckerberg said; it currently has 4,500 people reviewing the millions of reports that Facebook gets weekly.

Zuckerberg said the additional manpower will help Facebook remove videos with content including hate speech and exploitation of children.

The move is the latest attempt by the social network to fix the issue (see “Offensive Content Still Plagues Facebook”). Two months ago Facebook announced tools it hoped would prevent people from killing themselves on live videos (see “Big Questions Around Facebook’s Suicide Prevention Tools”). Since Facebook Live, which lets users broadcast live videos to friends on Facebook, launched about a year ago, several people have killed themselves via streaming video.

Such moves may not be enough to stanch the flow of violent videos—both those streamed live via Facebook Live and those recorded and then uploaded—though. There are nearly two billion people using Facebook at this point, and a recent tally by the Wall Street Journal found that at least 50 violent acts have been streamed via Facebook Live alone since it launched a year ago.

This number is almost certain to increase if more Facebook users gravitate to making and watching videos, and chances are they will. In the company’s last quarterly conference call, back in February, Zuckerberg called video a “megatrend,” and, more broadly, a recent report from Cisco indicated that mobile video now makes up 60 percent of all mobile data traffic.

The use of artificial intelligence tools could help—Facebook is already adept at using AI to do things like figure out who specific people are in the photos you upload—but even Zuckerberg believes that’s a long way off.

In a long piece he posted to the social network in February, he said the company is looking into technology that can automatically flag photos and videos that shouldn’t be on the site, and said about a third of the reports to Facebook’s content-reviewing team currently come from AI-based alerts.

“It will take many years to fully develop these systems,” he said.

For now, at least, Zuckerberg is hoping that people can do the job that technology can’t. On Wednesday, he did point to a bright spot, saying that a week earlier the company had helped stop someone from committing suicide on live video by contacting police in response to a user report.

“In other cases, we weren’t so fortunate,” he said.