The sheer scale of efforts by Facebook, YouTube, and Twitter to take down clips of the video shows how hard it is to stop people from spreading horrific content.

The news: Facebook says that in the first 24 hours after the attack it removed 1.5 million versions of the video filmed by the gunman who killed over 50 people in two mosques in Christchurch, New Zealand. Of those, 1.2 million were blocked while they were uploading, so they never made it onto the site. YouTube and Twitter have yet to release figures.

The gunman live-streamed the shooting for 17 minutes on Facebook, and it was quickly reposted both on that platform and on others. There are almost certainly still versions of the video available online, despite the efforts to remove them.

What next: There are growing calls for social-media companies to change their policies after the outrage—but it’s not always clear exactly what that means in practice. Bloomberg reports that New Zealand’s prime minister, Jacinda Ardern, is seeking talks with Facebook over live-streaming but hasn’t set out any specific demands.

Supply and demand: The problem, according to Facebook’s former chief security officer, Alex Stamos, is not just virality. It’s that the biggest tech companies have far less control than you might think over what data people in free societies choose to trade. It also reflects a systemic issue: social platforms often don’t even see themselves as arbiters of content in the first place. And that raises a perhaps more profound question: do we really want them to be?
