In Apple vs. the FBI, There Is No Technical Middle Ground

Either you let law enforcement get into encrypted devices and run the risk of letting others find a similar way in, or you don’t.
March 2, 2016

Policymakers say they want a compromise on law enforcement access to smartphones, but legal and technical experts say it’s tough to discern what that might look like in practice. Put simply, they say any process that makes it possible for a company to decrypt an encrypted phone—like the method the FBI advocates in its standoff with Apple over a dead terrorist’s iPhone—will inevitably be shared and make encryption less meaningful, breaking down trust and security.

The U.S. Congress is already weighing what to do. A commission proposed last week by Senator Mark Warner, a Virginia Democrat, and Representative Michael McCaul, a Texas Republican, would aim to generate “viable recommendations on how to balance competing digital security priorities.” Modeled on the commission Congress formed to investigate the security and intelligence failures that preceded the attacks of September 11, 2001, the panel would include law enforcement officials, cryptographers, and representatives of technology companies.

But it’s hard to envision a way for Apple or other companies to selectively undo encryption protections without gutting the value of encryption altogether, says Bruce Schneier, a cryptographer and security expert. “I can't think of any [compromise]. Either Apple weakens security or they do not. There's no weakening security halfway,” he says. He adds that he is unaware of any technical proposal that would make such a compromise possible.

A federal magistrate has ordered Apple to create special software to defeat protections on the iPhone left behind by Syed Rizwan Farook, one of the perpetrators of a December attack that killed 14 people in San Bernardino, California. The change would let the FBI rapidly try passcode combinations until it finds the one that decrypts the phone. Apple is appealing the order, a position bolstered Monday when a judge in New York rejected a similar FBI request in a different case. That ruling, however, did not void the California order or change the larger debate.

Apple’s technology works like this: any iPhone running iOS 8 or later has its contents encrypted in the phone’s storage, and they are decrypted only when a user enters a passcode. That passcode, combined with a unique ID number embedded in the device’s hardware and unknown to Apple, forms the “key” that unlocks the contents.
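To make the idea concrete, here is a minimal Python sketch of deriving an encryption key by entangling a passcode with a device-unique ID. It is illustrative only: the real iOS derivation runs in Apple’s hardware with Apple-specific algorithms, and the PBKDF2 settings and placeholder UID below are assumptions for the example.

```python
# Illustrative only: the real iOS key derivation runs inside dedicated
# hardware with Apple-specific algorithms and parameters. This sketch just
# shows the idea of "entangling" a passcode with a device-unique ID.
import hashlib

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # Mixing in the device UID means the same passcode yields a different
    # key on every device, so the key can't be computed off the device.
    return hashlib.pbkdf2_hmac(
        "sha256",                  # underlying hash (assumption)
        passcode.encode("utf-8"),  # what the user types
        device_uid,                # device-unique secret, unknown to Apple
        100_000,                   # deliberately slow to hinder guessing
        32,                        # 256-bit key for the storage cipher
    )

# Hypothetical 16-byte UID standing in for the real hardware value.
key = derive_key("1234", bytes(16))
```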

To protect the passcode, and thus the encryption, against so-called “brute force” attacks, in which every possible combination is tried, Apple has added software-based protections. Users can select an option that erases the contents of the phone after 10 failed passcode entries. And the passcode must be entered by hand on the device’s touch screen rather than submitted electronically. The magistrate ordered Apple to remove these limitations.
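The sketch below models that kind of guardrail logic in Python. The wipe threshold follows the behavior described above; the escalating-delay schedule and the helper names are assumptions added for illustration, and the real checks live inside iOS, not application code.

```python
# A loose model of the software guardrails described above; the real logic
# lives in iOS, and the delay schedule and helper names are assumptions.
import time

WIPE_THRESHOLD = 10  # the user-selectable "erase after 10 failures" option

def erase_all_content():
    # Stand-in for discarding the encryption keys, which renders the
    # encrypted contents permanently unreadable.
    raise SystemExit("Device wiped after too many failed attempts")

def handle_passcode_attempt(guess: str, is_correct, failures: int) -> int:
    """Return the updated failure count after one passcode attempt."""
    if is_correct(guess):
        return 0                        # unlocked; reset the counter
    failures += 1
    if failures >= WIPE_THRESHOLD:
        erase_all_content()
    else:
        # A growing delay between attempts makes even manual guessing slow.
        time.sleep(min(2 ** failures, 3600))
    return failures
```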

The FBI says the request governing Farook's phone would force Apple to create software that applies only to that device, and it has offered to let Apple install the custom version of iOS itself so the code never leaves Apple’s campus. (The FBI is not actually asking Apple to break the encryption itself, just the software protections against passcode guessing.)
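Those guessing limits are the crux, because the search space is tiny once they are gone: a four-digit numeric passcode has only 10,000 possibilities. The sketch below assumes a hypothetical check_passcode interface that such custom software might expose; it is not anything Apple or the FBI has described.

```python
# With the wipe and entry limits removed, exhausting a 4-digit passcode
# space is trivial. check_passcode is a hypothetical stand-in interface.
from itertools import product

def brute_force_four_digits(check_passcode):
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check_passcode(guess):   # at most 10,000 calls
            return guess
    return None

# Demonstration with a stand-in check against a known value.
secret = "7291"
print(brute_force_four_digits(lambda g: g == secret))  # -> 7291
```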

But while the proposal might conjure images of a clean room in Cupertino where such things could be done—and the code that makes it possible can’t get out—that’s not feasible in practice, says Andy Sellars, a lawyer specializing in technology issues at the Cyberlaw Clinic at Harvard Law School. “How long will that room really stay clean? The privacy benefit right now comes from the fact that nobody knows how to do this. Not Apple, not the FBI, and we think not the NSA, though maybe they do,” he says. “As soon as Apple does this, there’s no way this wouldn’t get out, be stolen, be leaked. There is no way that would stay a secret.”

Even Michael Hayden, former director of the National Security Agency, said in a recent interview with USA Today that it was hard to see how entry points for law enforcement wouldn’t end up being used by others. “Backdoors are good. Please, please, Lord, put backdoors in because I and a whole bunch of other talented security services around the world—even though that backdoor was not intended for me—that backdoor will make it easier for me to do what I want to do, which is to penetrate,” he said in the interview. “But when you step back and look at the whole question of American security and safety writ large, we are a safer, more secure nation without backdoors.”

Of course, the fact that a workaround is even possible (albeit with special effort at Apple) arguably reflects a weakness in how iOS 8 protects its encryption. Future versions could be tightened to rule out this kind of workaround.

At a panel discussion last week about the prospect of a compromise, Susan Hennessey, a fellow at the Brookings Institution and previously a lawyer at the NSA, said Congress would have to ask what kinds of tools it expects law enforcement to use and what it expects companies to do to help provide them.

“It’s helpful to understand that while there is not a 100 percent solution, walking away from this problem is not something that is going to happen,” she said. “Because there are federal agencies with a job to do and the American people expect them to do it.”
