Policy

The US now hosts more child sexual abuse material online than any other country

Experts predict that without new legislation, the problem will only grow.


The US hosts more child sexual abuse content online than any other country in the world, new research has found. The US accounted for 30% of the global total of child sexual abuse material (CSAM) URLs at the end of March 2022, according to the Internet Watch Foundation, a UK-based organization that works to spot and take down abusive content. 

The US hosted 21% of global CSAM URLs at the end of 2021, according to data from the foundation’s annual report. But that percentage shot up by nine percentage points during the first three months of 2022, the foundation told MIT Technology Review. The IWF found 252,194 URLs containing or advertising CSAM in 2021, a 64% increase from 2020; 89% of them were traced to image hosts, file-storing cyberlockers, and image stores. The figures are drawn from confirmed CSAM content detected and traced back to the physical server by the IWF to determine its geographical location.

That sudden spike in material can be attributed at least partly to the fact that a number of prolific CSAM sites have switched servers from the Netherlands to the US, taking a sizable amount of traffic with them, says Chris Hughes, director of the IWF’s hotline. The Netherlands had hosted more CSAM than any other country since 2016 but has now been overtaken by the US.

But the rapidly growing CSAM problem in the US is attributable to a number of longer-term factors. The first is the country’s sheer size: it is home to more data centers and secure internet servers than any other country, creating fast networks with stable connections that are attractive to CSAM hosting sites.

The second is that the vast scale of CSAM dwarfs the resources dedicated to weeding it out. This imbalance means that bad actors feel they’re able to operate with impunity within the US because the chance of them getting in trouble, even if caught, is “vanishingly small,” says Hany Farid, a professor of computer science at the University of California, Berkeley, and the co-developer of PhotoDNA, a technology that turns images into unique digital signatures, known as hashes, to identify CSAM.
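For readers curious how hash-based detection works in principle, here is a minimal sketch of a generic perceptual “average hash” in Python. This is not PhotoDNA’s actual algorithm, which is proprietary; the function names, the 8×8 grid size, and the match threshold below are illustrative assumptions.

```python
# Minimal sketch of a generic perceptual "average hash" (aHash).
# NOT PhotoDNA's proprietary algorithm -- just an illustration of how an
# image can be reduced to a compact signature that survives minor edits
# (resizing, mild recompression) and be matched against known hashes.
from PIL import Image  # pip install Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce an image to a 64-bit signature (grid size is illustrative)."""
    # Shrink and grayscale the image so the hash ignores scale and color.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the image-wide average?
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Near-duplicate images yield hashes that differ in only a few bits."""
    return bin(a ^ b).count("1")


# Hypothetical usage: flag an upload if it is close to a known hash.
# known_hashes = {average_hash("known_image.png")}
# if any(hamming_distance(average_hash("upload.png"), h) <= 5
#        for h in known_hashes):
#     print("possible match -- escalate for human review")
```

Real deployments rely on far more robust signatures and on curated hash lists shared by child protection organizations.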

Similarly, while companies in the US are legally required to report CSAM to the National Center for Missing & Exploited Children (NCMEC) once they’ve been made aware of it, with failure to report carrying a fine of up to $150,000, they’re not required to proactively search for it.

Besides “bad press,” there isn’t much punishment for platforms that fail to remove CSAM quickly, says Lloyd Richardson, director of technology at the Canadian Centre for Child Protection. “I think you’d be hard pressed to find a country that’s levied a fine against an electronic service provider for slow or non-removal of CSAM,” he says.

The volume of CSAM increased dramatically across the globe during the pandemic as both children and predators spent more time online than ever before. Child protection experts, including the anti-child-trafficking organization Thorn and INHOPE, a global network of 50 CSAM hotlines, predict the problem will only continue to grow. 

So what can be done to tackle it? The Netherlands may provide some pointers. The country still has a significant CSAM problem, owing partly to its national infrastructure, its geographic location, and its status as a hub for global internet traffic. However, it’s managed to make some major headway. It’s gone from hosting 41% of global CSAM at the end of 2021 to 13% by the end of March 2022, according to the IWF.

Much of that progress can be traced to the fact that when a new government came to power in the Netherlands in 2017, it made tackling CSAM a priority. In 2020 it published a report that named and shamed internet hosting providers that failed to remove such material within 24 hours of being alerted to its presence. 

It appeared to have worked—at least in the short term. The Dutch CSAM hotline EOKM found that providers were more willing to take down material quickly, and to adopt measures such as committing to removing CSAM within 24 hours of its discovery, in the wake of the list’s publication. 

However, Arda Gerkens, chief executive of EOKM, believes that rather than eradicating the problem, the Netherlands has merely pushed it elsewhere.  “It looks like a successful model, because the Netherlands has cleaned up. But it hasn’t gone—it’s moved. And that worries me,” she says. 

The solution, child protection experts argue, will come in the form of legislation. Work is underway in the US to amend Section 230 of the Communications Decency Act, a law that broadly makes the creators of online content, rather than the platforms that host it, legally responsible for that content, although it does not shield platforms from federal criminal prosecution.

Senators have proposed the EARN IT (Eliminating Abusive and Rampant Neglect of Interactive Technologies) Act, which would strip providers of online services of the legal protections Section 230 gives them in cases involving CSAM. That would allow individuals to file civil lawsuits against providers while also opening them up to criminal charges under state law.

Privacy and human rights advocates are fiercely opposed to the act, arguing that it could force companies to feel they have no alternative but to host less user-generated content and to abandon end-to-end encryption and other privacy protections. But the flipside to that argument, says John Shehan, vice president of the exploited children division at NCMEC, is that tech companies are currently prioritizing the privacy of those distributing CSAM on their platforms over the privacy of those victimized by it.

Even if lawmakers fail to pass the EARN IT Act, forthcoming legislation in the UK and Europe promises to hold tech platforms responsible for illegal content, including CSAM. The UK’s Online Safety Bill and Europe’s Digital Services Act could see tech giants hit with multibillion-dollar fines if they fail to adequately tackle illegal content once the laws come into force.

The new laws will apply to social media networks, search engines, and video platforms that operate in the UK or Europe, meaning that US-based companies such as Facebook, Apple, and Google will have to abide by them to continue operating in those markets. “There’s a whole lot of global movement around this,” says Shehan. “It will have a ripple effect all around the world.”

“I would rather we didn’t have to legislate,” says Farid. “But we’ve been waiting 20 years for them to find a moral compass. And this is the last resort.” 

Correction: An earlier version of this article incorrectly stated that Section 230 of the Communications Decency Act does not make exceptions for CSAM. The article has been updated to reflect that CSAM violates federal law.
