Apple is fighting a court order issued Tuesday that demands the company “unlock” an encrypted iPhone owned by one of the perpetrators of the terrorist attacks in San Bernardino, California, in December. This sets up a long-anticipated showdown between the technology industry and law enforcement over whether the government should gain back-door access to encrypted consumer devices.
Apple is a leader in providing encryption on its popular devices; the newest versions of the company’s operating system encrypt data by default on products like iPads and iPhones, and also encrypt communications so that only the sender and receiver can see them.
Law enforcement has long held that this is a major stumbling block to investigations because it means criminals can “go dark,” but others argue that as a practical matter there are many other ways to track suspects. What’s more, creating back doors could aid repressive governments, spies, and criminals.
Last year top U.S. antiterrorism and crime-fighting agencies said they wanted access, but the White House later backed off, creating an impasse. The November terrorist attacks in Paris and the shootings in San Bernardino, however, triggered new calls for access from law enforcement and political leaders.
In particular, the FBI complained that agents couldn’t see the contents of an iPhone 5c used by Syed Rizwan Farook, one of the shooters in San Bernardino.
The federal magistrate who issued the order, Sheri Pym, didn’t order Apple to turn off its encryption but, rather, to make it easier for federal agents to repeatedly guess the device’s passcode. Apple’s iPhones currently thwart such efforts by adding a delay when they detect a “brute force” attack on a phone. The company has bragged it would take five and a half years for someone to guess every possible code of six numbers or lowercase letters.
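Apple’s figure holds up to simple arithmetic: six characters drawn from 26 lowercase letters and 10 digits give 36^6, or roughly 2.2 billion, possible passcodes. If each attempt takes about 80 milliseconds of hardware-enforced key-derivation time (a commonly cited figure for iOS, not stated in the order), trying them all takes about five and a half years. A quick sketch:

```python
# Back-of-the-envelope check of Apple's brute-force estimate.
# Assumption: ~80 ms of hardware-enforced work per passcode attempt
# (a commonly cited figure for iOS key derivation, not from the article).

ALPHABET = 26 + 10          # lowercase letters plus digits
LENGTH = 6                  # six-character passcode
SECONDS_PER_ATTEMPT = 0.08  # ~80 ms per guess

combinations = ALPHABET ** LENGTH
worst_case_seconds = combinations * SECONDS_PER_ATTEMPT
worst_case_years = worst_case_seconds / (365.25 * 24 * 3600)

print(f"{combinations:,} possible passcodes")        # 2,176,782,336
print(f"~{worst_case_years:.1f} years to try all")   # ~5.5 years
```

The software delays the order targets sit on top of this hardware cost, which is why removing them still would not make guessing instantaneous.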
The magistrate’s order says that if Apple removes these delays, “it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.” The magistrate also demanded that Apple turn off any “auto-erase” functions, which are designed to erase the phone’s data if someone guesses too many incorrect passwords.
One technical expert says it is possible for Apple to create a custom version of its operating system to specifically deal with the FBI’s request on only Farook’s phone while limiting the risk of creating a larger security vulnerability.
But in his letter to customers, Apple CEO Tim Cook said the request amounted to asking that the company break encryption. “Building a version of iOS that bypasses security in this way would undeniably create a backdoor,” he said. He vowed to fight the order, making it likely that new case law will be set on the matter as the dispute works its way through the courts.