How an Overreaction to Terrorism Can Hurt Cybersecurity
Many of today's technological security failures can be traced to a failure to use encryption. In 2014 and 2015, unnamed hackers—probably the Chinese government—stole 21.5 million personal files of U.S. government employees and others. They wouldn’t have obtained this data if it had been encrypted.
Many large-scale criminal data thefts were made either easier or more damaging because data wasn’t encrypted: Target, T.J. Maxx, Heartland Payment Systems, and so on. Many countries are eavesdropping on the unencrypted communications of their own citizens, looking for dissidents and other voices they want to silence.
Some law enforcement leaders have proposed adding back doors to encrypted data to allow access for court-authorized investigations, arguing that this will prevent criminals or terrorists from “going dark,” as FBI director James Comey put it in a 2014 Brookings Institution talk (“Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?”). But that approach will only exacerbate the risks.
We can’t build an access system that works only for people with a certain citizenship or a particular morality, or in the presence of a specified legal document. If the FBI can eavesdrop on your text messages or get at your computer’s hard drive, so can other governments. So can criminals. So can terrorists. If you want to understand the details, read a 2015 paper coauthored by MIT professor Hal Abelson, called “Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications.”
The debate over whether law enforcement should gain access to encrypted messages and other data reëmerged in light of the Paris terror attacks and others. But it’s a false choice to say you can have either privacy or security. The real choice is between having less security and having more security. Of course, criminals and terrorists have used—are using, will use—encryption to hide their planning from the authorities, just as they will use society’s amenities and infrastructure: cars, restaurants, telecommunications. In general, we recognize that such things can be used by both honest and dishonest people. Society thrives nonetheless, because the honest so outnumber the dishonest.
The security technologist Bruce Schneier is the author most recently of Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.