
The mass shooting in New Zealand shows how broken social media is

March 15, 2019

A gunman live-streamed the murder of dozens of innocents in two mosques in Christchurch, New Zealand, on Friday—and the world got a terrible reminder of how flawed existing social-media policies and algorithms are for policing violent and offensive content.

In the days before the shooting, the perpetrator apparently boasted of his plans and posted an online manifesto. He then broadcast the horrific act live on Facebook. The attack left 49 people dead and dozens more injured.

Live stream: Over the past 18 months, following harassment and fake-news scandals, social-media companies have invested heavily in content moderators. But this did little to stop video of the shooting from spreading. Not only was the live stream reportedly up for 20 minutes, but the resulting video was then reposted on YouTube, with some clips remaining up for over an hour.

Several factors contributed to letting the footage slip through the filters, according to experts.

Real-time challenge: It’s vital to catch a video quickly, so that it doesn’t spread onto other platforms. But social-media moderation simply isn’t geared toward catching content in real time. The process is impossible to automate effectively, and manually identifying live streams that need to be shut down is “like finding a needle in the haystack of data that’s flowing over the network all the time,” says Charles Seife, a professor at NYU’s School of Journalism. He adds that Facebook could require users to build up a reputation before letting them live-stream content, to reduce the risks.
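Seife’s suggestion points at a concrete design: live-streaming as a privilege an account earns rather than a default setting. The sketch below shows what such a gate could look like; every field, threshold, and name in it is a hypothetical assumption for illustration, not Facebook’s actual policy.

```python
from dataclasses import dataclass

# Hypothetical sketch of the reputation gate Seife describes: an account must
# clear some minimum history and trust bar before live-streaming is enabled.
# All fields, thresholds, and names are illustrative assumptions.

@dataclass
class AccountReputation:
    account_age_days: int     # how long the account has existed
    followers: int            # rough audience size
    prior_strikes: int        # past content-policy violations
    verified_identity: bool   # has the user confirmed who they are?

def may_live_stream(rep: AccountReputation,
                    min_age_days: int = 90,
                    min_followers: int = 50) -> bool:
    """Return True only if the account has earned live-streaming privileges."""
    if rep.prior_strikes > 0:
        return False          # any past strike revokes the privilege
    if rep.account_age_days < min_age_days:
        return False          # brand-new accounts have to wait
    if rep.followers < min_followers and not rep.verified_identity:
        return False          # unknown, unverified accounts stay gated
    return True

# A fresh, unverified account with no history cannot broadcast live:
print(may_live_stream(AccountReputation(account_age_days=3, followers=2,
                                        prior_strikes=0, verified_identity=False)))
# -> False
```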

Whack-a-mole: Moderators are overwhelmed at the best of times. Video of the shooting hosted on YouTube most likely spread so quickly that the humans employed to check for inappropriate content didn’t have time to catch everything. These workers typically have only a few seconds to make each call. The checking can be partly automated, but those who reposted the footage apparently clipped it and introduced distortions to slip past those automated filters.
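One reason clipping and filtering works: automated rechecking typically compares fingerprints (hashes) of known footage against new uploads, and small edits can shift the fingerprint enough to break a near-exact match. The toy “average hash” below is a hypothetical stand-in for such a system, sketched only to show the failure mode, not any platform’s real matcher.

```python
# Toy "average hash" fingerprint over a frame's brightness values. Real systems
# use far more robust perceptual hashes over whole videos, but the failure mode
# is the same: the matcher compares bit patterns, and an edited copy's pattern
# drifts away from the original's.

def average_hash(pixels):
    """One bit per pixel: set if the pixel is brighter than the frame's mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

original  = [10, 200, 30, 220, 40, 210, 20, 230]   # pretend frame of the known video
distorted = [130, 180, 30, 215, 140, 90, 25, 135]  # cropped, filtered re-upload

print(hamming_distance(average_hash(original), average_hash(distorted)))  # -> 3
# If the matcher treats only near-duplicates (say, at most 1 differing bit) as
# the same footage, this re-upload sails through, and a human moderator may
# never get to it.
```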

Algorithmic failure: Social-media companies also use algorithmic tweaks to de-prioritize suspicious content. But Mike Ananny, an associate professor at the University of Southern California, says these algorithms were probably thrown by the popularity of the offending videos.
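To make Ananny’s point concrete: if a ranking score mixes an engagement signal with a penalty for suspected policy violations, a sufficiently viral video can outscore the penalty. The formula and weights below are invented purely for illustration and do not describe any real platform’s ranking system.

```python
import math

# Invented scoring function illustrating how an engagement term can swamp a
# fixed penalty for suspected policy violations. Formula and weights are
# hypothetical, chosen only to show the failure mode.

def rank_score(shares_per_hour, suspicion,
               engagement_weight=1.0, suspicion_weight=5.0):
    """Higher score = shown to more people; suspicion is in [0, 1]."""
    return engagement_weight * math.log1p(shares_per_hour) - suspicion_weight * suspicion

print(rank_score(shares_per_hour=50, suspicion=0.0))        # ordinary clip: ~3.9
print(rank_score(shares_per_hour=200_000, suspicion=0.6))   # flagged viral clip: ~9.2
# The flagged video still outranks the ordinary one, because virality keeps
# growing while the suspicion penalty is capped.
```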

Not our problem: These factors reflect the key systemic problem: Facebook, YouTube, and other big social platforms do not see themselves as the arbiters of content in the first place. Research has shown that far-right sources of information can be policed more proactively to prevent violent or hateful material from spreading. “They have this attitude of being post hoc,” says Ananny. “It’s a deep cultural thing.”
