How an Overreaction to Terrorism Can Hurt Cybersecurity

Encryption could have prevented some of the worst cyberattacks. Giving back doors to law enforcement will make matters worse, argues Bruce Schneier.
January 25, 2016

Many of today's technological security failures can be traced to a failure to encrypt. In 2014 and 2015, unnamed hackers, probably working for the Chinese government, stole the personal files of 21.5 million U.S. government employees and others. They would not have obtained this data if it had been encrypted.

Many large-scale criminal data thefts were made either easier or more damaging because data wasn’t encrypted: Target, T.J. Maxx, Heartland Payment Systems, and so on. Many countries are eavesdropping on the unencrypted communications of their own citizens, looking for dissidents and other voices they want to silence.
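The point about data at rest can be made concrete with a toy sketch (a one-time pad, not production cryptography, and the sample record is invented): ciphertext stolen from a breached server is worthless to the thief as long as the key is stored elsewhere.

```python
# Toy illustration only: a one-time pad shows why stolen ciphertext
# is useless without the key. Real systems use AES or similar.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the key; applying it twice decrypts."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

record = b"SSN: 000-00-0000"               # hypothetical sensitive record
key = secrets.token_bytes(len(record))     # kept separate from the data store
stolen = xor_cipher(record, key)           # what a breach would exfiltrate
assert xor_cipher(stolen, key) == record   # only the key holder can recover it
```

The design point is the separation: the breached database yields only `stolen`, and without `key` the attacker learns nothing about the record.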

Some law enforcement leaders have proposed adding back doors to encrypted data to allow access for court-authorized investigations, arguing that this will prevent criminals or terrorists from “going dark,” as FBI director James Comey put it in a 2014 Brookings Institution talk (“Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?”). But that approach will only exacerbate the risks.

We can’t build an access system that works only for people with a certain citizenship or a particular morality, or in the presence of a specified legal document. If the FBI can eavesdrop on your text messages or get at your computer’s hard drive, so can other governments. So can criminals. So can terrorists. If you want to understand the details, read a 2015 paper coauthored by MIT professor Hal Abelson, called “Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications.”

The debate over whether law enforcement should gain access to encrypted messages and other data reëmerged in light of the Paris terror attacks and others. But it’s a false choice to say you can have either privacy or security. The real choice is between having less security and having more security. Of course, criminals and terrorists have used—are using, will use—encryption to hide their planning from the authorities, just as they will use society’s amenities and infrastructure: cars, restaurants, telecommunications. In general, we recognize that such things can be used by both honest and dishonest people. Society thrives nonetheless, because the honest so outnumber the dishonest.

The security technologist Bruce Schneier is the author most recently of Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.

Illustration by Rose Wong
