Is an Internet clampdown coming? Very possibly – unless Internet Service Providers (ISPs) and Web hosting companies begin to behave more responsibly.
Pointing to the growing trends of online fraud, copyright infringement, hacker attacks, and spam, Jonathan Zittrain, codirector of the Berkman Center for Internet and Society at Harvard Law School, says law enforcement, government, and the online industry itself will eventually foist upon us a more "battened-down" Internet – one that's more secure, perhaps, but also far less flexible and amenable to creativity and commercial opportunity. In other words, the technological promise of the Internet is under serious threat. This month, Technology Review adds another item to Zittrain's list of the Internet's dysfunctions: its increasingly effective exploitation by terrorist organizations.
The several facets of this deeply troubling trend are described in “Terror’s Server.” Law enforcement officials are increasingly worried that online fraud is funding terrorist atrocities: Imam Samudra, convicted mastermind of the Bali disco bombing of October 2002, has penned a jailhouse memoir that offers a primer on online fraud for his fellow terrorists. More broadly, jihadists are making good use of the Net, with websites that recruit members, solicit funds, and promote violence. Finally, the Internet is enabling a ghastly new spectacle: tens of millions – perhaps hundreds of millions – of people around the world have gone online to watch struggling hostages in Iraq have their heads sawed off.
There are some possible technological fixes for security agencies and others who want to deprive terrorists of the use of the Internet. These include chat-room screening algorithms and new anti-fraud measures that authenticate e-mail. But Web hosting companies might also consider exercising more editorial judgment about online content. While any efforts at blocking and filtering pose technological challenges and would raise libertarian ire, surely these companies could be more vigilant and at least try to limit the posting and propagation of the most offensive and violent material.
This would not restrict free speech in any novel way. In most cases, it would require only the enforcement of existing terms of service. These terms generally say users must not post any kind of hate speech, racist comments, violent images, or illegal content like child pornography or copyrighted material. Currently, Web hosting companies enforce these terms only when people complain. We recognize that screening content up front, as broadcasters and editors do in traditional media, would be extremely difficult, if not impossible. But ISPs and Web hosting companies increasingly seem to want to be new-media broadcasters without assuming the responsibility that that entails – and its costs.
If the online industry doesn’t take a more active stance, events may overtake it. Laws tend to change when some awful event creates public demand for legislation. In the United States, September 11 created the Patriot Act, with its erosion of civil liberties and privacy. ISPs should begin to behave more like traditional broadcasters, screening obscenely violent content – and they should do it soon. If they don’t, it’s not inconceivable that, some day, legislative bodies – spurred by some online outrage yet to come – will do it for them.
Videos of politically motivated beheadings are now freely available online. They are posted to inflame the zealous and titillate the jaded. Only the most ardent of libertarians would hesitate to join us in saying, Enough.