Apple is fighting a court order issued Tuesday that demands the company “unlock” an encrypted iPhone owned by one of the perpetrators of the terrorist attacks in San Bernardino, California, in December. This sets up a long-anticipated showdown between the technology industry and law enforcement over whether the government should gain back-door access to encrypted consumer devices.
Apple is a leader in providing encryption on its popular devices; the newest versions of the company’s operating system encrypt data by default on products like iPads and iPhones, and also encrypt communications so that only the sender and receiver can read them.
Law enforcement has long held that this is a major stumbling block to investigations because it means criminals can “go dark,” but others argue that as a practical matter there are many other ways to track suspects. What’s more, creating back doors could aid repressive governments, spies, and criminals.
Last year top U.S. antiterrorism and crime-fighting agencies said they wanted such access, but the White House later backed off, creating an impasse. The November terrorist attacks in Paris and the shootings in San Bernardino, however, triggered new calls for access from law enforcement and political leaders.
In particular, the FBI complained that agents couldn’t see the contents of an iPhone 5C used by Syed Rizwan Farook, one of the shooters in San Bernardino.
The federal magistrate who issued the order, Sheri Pym, didn’t order Apple to turn off its encryption but, rather, to make it easier for federal agents to guess the device’s passcode. Apple’s iPhones currently thwart such efforts by adding a delay when they detect that someone is mounting a “brute force” attack on a phone. The company has bragged that it would take five and a half years for someone to guess every possible code of six numbers or lowercase letters.
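Apple’s five-and-a-half-year figure is easy to check with back-of-the-envelope arithmetic. The sketch below assumes a six-character passcode drawn from 36 symbols (ten digits plus 26 lowercase letters) and roughly 80 milliseconds of hardware key-derivation work per guess, the per-attempt cost Apple has cited in its security documentation; the constants are illustrative, not taken from the court order.

```python
# Back-of-the-envelope check of Apple's "five and a half years" claim.
SYMBOLS = 36              # 10 digits + 26 lowercase letters
LENGTH = 6                # six-character passcode
SECONDS_PER_GUESS = 0.08  # ~80 ms of key derivation per attempt (assumed)

combinations = SYMBOLS ** LENGTH                     # 2,176,782,336 codes
worst_case_seconds = combinations * SECONDS_PER_GUESS
worst_case_years = worst_case_seconds / (365.25 * 24 * 3600)

print(f"{combinations:,} possible passcodes")
print(f"worst case: about {worst_case_years:.1f} years")  # ≈ 5.5 years
```

Note that this is the time to try every code even with no software-imposed delays; the artificial delays and auto-erase threshold the order targets make the practical barrier far higher.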
The magistrate’s order says that if Apple removes these delays, “it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.” The magistrate also demanded that Apple turn off any “auto-erase” functions, which are designed to erase the phone’s data if someone guesses too many incorrect passwords.
One technical expert says it is possible for Apple to create a custom version of its operating system to specifically deal with the FBI’s request on only Farook’s phone while limiting the risk of creating a larger security vulnerability.
But in his letter to customers, Apple CEO Tim Cook said the request amounted to asking the company to break encryption. “Building a version of iOS that bypasses security in this way would undeniably create a backdoor,” he said. He vowed to fight the order, raising the possibility that new case law will be set as the dispute works its way through the courts.