Don’t Blame Encryption for ISIS Attacks
Let’s start with what we don’t know. No firm details have been released about how the perpetrators of the attacks in Paris last Friday communicated.
All the same, some media outlets, politicians, and security leaders in Europe and the U.S. are now suggesting that the tragic events show how encryption technology has lately made it easier for terrorists to evade the authorities.
Central Intelligence Agency director John Brennan complained about that at an event at the Center for Strategic & International Studies on Monday. “There are a lot of technological capabilities that are available right now that make it exceptionally difficult, both technically as well as legally, for intelligence security services to have insight that they need,” he said.
There is also much chatter about the possibility that the Paris attackers used Sony’s PlayStation gaming network to communicate because it offers a very high level of protection against eavesdropping. This is based on a claim, since retracted, that a PlayStation 4 console was among the items seized in a series of raids this weekend in France and Belgium. (Belgium’s interior minister did say last week that it was “very, very difficult” for intelligence agencies to “decrypt” communications made through PlayStations, but he didn’t back up his claim.)
Many security and encryption experts are shaking their heads today because claims that encryption is crippling intelligence agencies and law enforcement have become common over the past year or so. Now as then, these claims are presented without solid evidence.
Encryption technology is pervasive. We use it and are protected by it every day. Requiring certain companies within the sphere of influence of certain governments to remove or weaken their encryption wouldn’t make it harder for a determined person to find alternative ways to secure data or communications. It would strip privacy protections from millions of people and could create dangerous new flaws.
As encryption experts and companies such as Google and Facebook have pointed out, once a government backdoor is created, it is very hard to prevent it from being abused, whether by the people it was created for or by bad actors who force their way in. When one expert tried to design the safest encryption backdoor possible earlier this year, he couldn’t eliminate that problem. That’s why the White House last month backed away from its earlier interest in forcing companies to let U.S. authorities pierce their encryption.
It would be nice if it were possible to solve the complex, messy, and tragic matter of terrorism by telling a few tech companies what to do. As cryptographer Matthew Green of Johns Hopkins University neatly summarizes, it sadly isn’t.
I’m kind of sick about the whole thing and hate to state the obvious. But no, shutting down WhatsApp will not stop ISIS. It just won’t.
— Matthew Green (@matthew_d_green) November 16, 2015