What If Apple Is Wrong?

Phones that lock away everything they hold could inhibit law enforcement more than we really want.

Soon after Devon Godfrey was shot to death in his apartment in Harlem on the evening of April 12, 2010, officers with the New York Police Department thought they knew who did it. Security cameras had captured a man entering and leaving the apartment around the time Godfrey was believed to have been killed. They arrested the suspect on a charge of murder.

At that point in a case, prosecutors in New York have less than a week to gather the facts needed to persuade a grand jury to indict the suspect. So the prosecutor on this case, Jordan Arnold of the Manhattan District Attorney’s Office, worked through a weekend to dig deeper into the evidence.

Cell phones had been found in Godfrey’s apartment, including an iPhone that was locked by its passcode. Arnold recalls doing what he always did in homicides back then: he obtained a search warrant for the phone and put a detective on a plane to Cupertino, California. The detective would wait in Apple’s headquarters and return with the data Arnold needed. Meanwhile, investigators looked more closely at the apartment building’s surveillance video, and Arnold examined records sent by Godfrey’s wireless carrier of when calls and texts were last made on the phones.

Things Reviewed

  • Don’t Panic: Making Progress on the “Going Dark” Debate

    The Berkman Center for Internet & Society
    February 2016

  • The Ground Truth About Encryption and the Consequences of Extraordinary Access

    The Chertoff Group
    March 2016

  • Report of the Manhattan District Attorney’s Office on Smartphone Encryption and Public Safety

    November 2015

  • Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications

    Computer Science and Artificial Intelligence Laboratory, MIT
    July 2015

With this new evidence in hand, the case suddenly looked quite different. From the wireless carrier, Arnold saw that someone—presumably Godfrey—had sent a text from the iPhone at a certain time. But the recipient of that text had used a disposable “burner” phone not registered under a true name. So who was it? The iPhone itself had the crucial clue. Arnold could see that Godfrey referred to the person by a nickname. People who knew Godfrey helped police identify the man who went by that nickname. It was not the man who was originally arrested. It was Rafael Rosario—who also appeared in the apartment surveillance footage. Rosario confessed and later pleaded guilty.

What would the outcome have been if Godfrey had been killed today, now that Apple has tightened the security on iPhones so that it can no longer get data from them when police come calling? Digital evidence that remains in abundance—revealing, for example, when and where devices were used—probably would have been enough to show that the original suspect did not kill Godfrey. But he might have sat in jail longer before being cleared, and the actual killer might never have been found.

“Without having had access to the contact list in the devices and the content of text messages, we would have been left, I think, with just surveillance footage,” Arnold says. “We may have had to hang our hopes on the ability of someone to come forward and say ‘I know who that is.’”

Are we certain we want to eliminate an important source of evidence that helps not only cops and prosecutors but also judges, juries, and defense attorneys arrive at the truth? That essential question got lost in this winter’s remarkable confrontation between the FBI and Apple, when the company refused to help the agency break into an iPhone that Syed Rizwan Farook had used before he and his wife shot 14 people to death and wounded 22 others in San Bernardino, California. In that case, the perpetrators wouldn’t be standing trial—they were dead. Investigators already had reams of information on them, including files that Farook’s phone had backed up to Apple’s iCloud service. A friend accused of buying guns for them had been arrested. And then the FBI found a way into the device after all.

The issue looks much different in crimes like Godfrey’s murder, where local cops with far scarcer resources than the FBI can find themselves with little to go on. Evidence that once could be found in cameras, notepads, address books, calendars, and ledgers often now exists only on phones. And on the phones discovered in Godfrey’s apartment in 2010, there was enough evidence to help clarify that one man did not commit murder and another man did.

The contact list and text messages on an iPhone found in this New York City apartment in 2010 offered crucial evidence of who had killed the owner of the device.

The argument for opening smartphones to law enforcement is not that we should make police work as easy as possible. In a free society, some criminals will always slip away because of restraints on investigation that are necessary for balancing liberty and security. Evidence is always lost to time, to decay, to confusion, to incompetence, and to murky memories. We will always keep secrets in safes, in encrypted files, and in our minds.

But we need to ask whether too much evidence will be lost in smartphones that now lock away all that they hold—not just message traffic but also calendar entries, pictures, and videos—even when police have a legal right to view those contents. Apple will eventually close the hole the FBI used to get into the San Bernardino phone, and it is now exploring ways of cutting off its avenue for giving police data backed up in the cloud, too. What if these new layers of secrecy undermine the justice system without even increasing your privacy very much?

Sea change

When FBI director James Comey and other law enforcement leaders warned in public forums in 2014 that new layers of encryption on smartphones were causing criminals to “go dark,” it had a familiar ring of hyperbole. Twenty years earlier, American officials were so worried about criminals cloaking their misdeeds that they sought to force companies wanting to encrypt data to use a Clipper Chip, a piece of hardware designed by the National Security Agency to let the authorities unlock data with a digital skeleton key. The Clipper Chip deservedly died after it became clear that the requirement was unworkable and the chip was hackable. And even without it, investigators still managed to prosecute plenty of criminals, thanks in no small part to technologies such as security cameras, location tracking through cell-phone towers, wiretaps on phone calls, and e-mail and text messages that most people did not bother to encrypt and thus could be gathered under a court order.

Digital evidence was in fact so plentiful that by 2011, privacy scholars Peter Swire and Kenesa Ahmad declared our times “the golden age of surveillance.” They and other privacy advocates didn’t dispute law enforcement officials’ contention that terrorists, cartel bosses, and pedophiles were covering their tracks with encrypted messaging services. But such losses to law enforcement, Swire and Ahmad wrote, were “more than offset by surveillance gains from computing and communications technology.”

We’re about to find out whether that will remain true as more and more people do their computing and communicating on a device that is entirely inaccessible to police by default and is wiped clean if anyone makes too many attempts to guess the passcode, which is now the only way to unlock the phone. Since Apple redesigned its iOS operating system in 2014 so that it could no longer open iPhones and iPads—and Google followed suit on some Android devices—phones have started to pile up uselessly in evidence rooms. They are not just hindering investigations of certain clever criminals; they’re inhibiting every kind of case.
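
For readers curious about the mechanics, here is a minimal, hypothetical sketch in Python of the protection the article describes: the decryption key is derived from the passcode itself, so the vendor never holds a copy, and repeated wrong guesses destroy the data. The class and parameter names are illustrative only; Apple’s actual design also entangles the passcode with a per-device hardware secret, but the consequence is the same: no passcode, no key.

```python
import hashlib
import os
import secrets
from typing import Optional

PBKDF2_ITERATIONS = 200_000  # deliberately slow derivation, to make guessing costly


def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stretch the passcode into a 256-bit key; no vendor copy exists."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, PBKDF2_ITERATIONS)


class LockedDevice:
    MAX_ATTEMPTS = 10  # too many wrong guesses and the data is gone

    def __init__(self, passcode: str):
        self.salt = os.urandom(16)
        # Store only a hash of the derived key for verification,
        # never the key or the passcode itself.
        self._verifier = hashlib.sha256(derive_key(passcode, self.salt)).digest()
        self._attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> Optional[bytes]:
        if self.wiped:
            return None  # data already destroyed by repeated failures
        key = derive_key(guess, self.salt)
        if secrets.compare_digest(hashlib.sha256(key).digest(), self._verifier):
            self._attempts = 0
            return key  # with this key, the data partition can be decrypted
        self._attempts += 1
        if self._attempts >= self.MAX_ATTEMPTS:
            self.wiped = True  # the auto-wipe described above
        return None
```

Because the only route to the key runs through the passcode, and guessing is throttled by both the slow derivation and the wipe, there is nothing Apple could hand over even if it wanted to.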

In Baton Rouge, Louisiana, investigators are stuck with about 60 locked devices, including an iPhone belonging to Brittney Mills, a 29-year-old woman who was shot to death at her doorstep one night in 2015, when she was eight months pregnant. The baby was delivered but died soon after. Mills had opened the door for the killer, so she probably knew the person. But police could find little physical evidence, and no eyewitnesses have come forward, according to Hillar Moore III, the East Baton Rouge Parish district attorney. Mills’s nine-year-old daughter was home at the time, but she only heard the shooting and then hid in the bathroom. Police know which people Mills texted and called the day of her death, but they don’t know what was said, and Moore says no suspects stand out in that group.

Mills’s iPhone could offer vital evidence. Her relatives have told investigators that she kept a diary on the device. But she hadn’t backed her phone up to Apple’s iCloud service for three months. No one knows her passcode, and Apple says it can’t open the phone. Whether it holds the kinds of clues that cracked Devon Godfrey’s murder in 2010 might never be known.

At least for now, however, there have not been enough such cases to convincingly show that the “golden age of surveillance” is ending. “We don’t see a groundswell of disruption in law enforcement,” says Paul Rosenzweig, a security consultant who worked on a report on the effects of encryption that was published in March by the Chertoff Group, which advises businesses and governments. He argues that the vast majority of cases that local cops handle do not hinge on evidence now trapped inside cell phones. “It may well be that three years from now, we’ll all look up and say that there’s been a sea change,” he says. “I suspect not, honestly.”

Rosenzweig echoes many computer scientists who say that if the data on phones is encrypted in such a way that the passcode is the only decryption key, identity thieves and other criminals who get hold of your device will be foiled, and that this benefit outweighs the negative consequences. As phones gain functions well beyond communications and are used for storing medical information or making payments, removing layers of security (or declining to add more) would be a disaster, says Hal Abelson, a computer scientist at MIT.

These protections are in fact so necessary, Abelson says, that it’s just too bad if they cause problems for the police—they will have to adapt. He brings up a report published last fall by Cyrus Vance Jr., the district attorney of Manhattan. Vance says that since Apple stopped being able to get into its devices, his office has been unable to carry out more than 215 search warrants for iPhones and iPads, in cases that include homicide and sexual abuse of children. Abelson argues that Vance’s office should try to break into the phones, as the FBI ended up doing in San Bernardino. But he does not think Apple should redesign its devices so it can go back to getting data off them for cops. I asked Abelson to consider the most extreme scenario: what if Vance could show that extracting information from those phones was the only way to solve the cases? Abelson was unmoved. “Tough,” he said.

“Are you crazy?”

In his huge office on the eighth floor of a criminal justice building in lower Manhattan, Cyrus Vance can hear car horns honking even when the windows are closed. One wall has a Richard Avedon portrait of his father, who was Jimmy Carter’s secretary of state, alongside mementos from Seattle, where Vance Jr. and his wife raised their two children while he was in white-collar law practice. The other side of the office speaks to the six years he has been the top prosecutor in the nation’s financial capital, where he has become known for his data-driven approach to law enforcement. There is a Scotland Yard drink coaster on a conference table, near an easel that has been moved out of my sight because, I’m told, it holds notes on iPhone-related cases.

Cyrus Vance Jr., the district attorney in Manhattan, says smartphone makers have not shown that older devices that allowed search warrants to be carried out were significantly less secure.

Vance makes no dramatic claims about “going dark,” preferring a measured, lawyerly form of argument. When I tell him that his statistics on inaccessible iPhones don’t yet impress many computer scientists, he makes a facial expression equivalent to a shrug. “Some people have made the determination that not being able to do the kinds of work we do is an acceptable collateral damage,” he says. “I’m not sure how the individual would respond if someone close to him or her were the victim of a crime and the case might depend on the ability to access a phone. Easy to say, unless it’s you. We deal with a lot of victims. We talk to the people it’s actually happened to.”

To Vance, the right course of action is obvious. He thinks the federal government should pass a law that would require the makers of smartphone operating systems to be able to give data to investigators who come to them with search warrants and devices in hand, as they did before 2014. Vance’s proposal would not restrict you from installing apps that encrypt messages, but you could no longer pick a phone off the shelf that makes everything on it invisible to police.

Although locked smartphones have held up a small percentage of cases, Vance is convinced a law is needed before the number climbs much higher. To illustrate his point, he describes a case from 2012. A man in Manhattan was taking a video on his iPhone when he was shot and killed by someone who threatened to go after the eyewitnesses if they talked. Investigators got that video and convicted the killer. Now imagine that shooting happening today. If the victim made the video on a stand-alone digital camera, it would be fair game for cops with a warrant. Why should they be unable to see it just because he used an iPhone?

“It’s too facile to say ‘It’s the golden age of surveillance, and you shouldn’t be able to get into the phone,’” Vance says. “If your view is that it’s [law enforcement’s] job to investigate fully, to get at the truth so that justice can be determined by the true facts, then you would take my position, which is: ‘Are you crazy? You wouldn’t want law enforcement to have access to what may be the most critical evidence?’”

Why can’t investigators often do what the FBI finally did in San Bernardino, and figure out a way into phones themselves? Vance says it doesn’t serve justice very well to make law enforcement continually scramble to find arcane and expensive ways of catching up with Silicon Valley. He also finds it unrealistic. “It’s one thing for the FBI to be in an arms race with the cell-phone companies. It’s another thing for Onondaga County, New York, where there’s digital evidence on phones of child abuse, senior abuse, fraud. The DA and the sheriff in that county—what are they going to do?”

Can’t you ask the National Security Agency to get into phones, much as Edward Snowden revealed that the NSA cracked services run by Silicon Valley? “I cannot,” Vance says. (He does not elaborate—but if the spy agency were allowed to help local law enforcement, it would be forced to reveal its methods in court.)

Instead, he says, Apple and Google should stop expecting special treatment not accorded to other corporations. For example, financial institutions had to build complex systems for catching money laundering and other crimes. “Two companies that own 96 percent of the world’s smartphone operating systems have independently decided they’re going to choose where the line between privacy and public safety is to be drawn,” Vance says. “We should ask them to make the same kind of adaptations that we require banks to do.”

Most provocatively, Vance contends that today’s smartphones might not be meaningfully more resistant to hacking than earlier versions. Even before 2014, you were not exactly a sitting duck; you could erase some lost or stolen devices from afar. Computer security experts insist that any system in which your passcode is the only key is safer than a system that Apple or Google can also open. But in an attempt to understand the trade-offs involved, Vance has written to the top lawyers at those companies, asking them to quantify the improvement. How much safer are phones now? Had any outsider ever managed to hack the companies’ tools for getting data out of devices that police brought to them? Neither company has responded.
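
What Vance wants back is, in miniature, the arrangement the “Keys Under Doormats” authors warn against: each device’s data key is also wrapped, or escrowed, under a master key the vendor holds, which is what let vendors answer warrants before 2014. The sketch below is hypothetical, written with the third-party Python cryptography library; the variable names are ours, and nothing here reflects Apple’s or Google’s actual internals.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Hypothetical key escrow: a copy of the device's data key is wrapped
# under a vendor-held master key, so the vendor can recover it later.

vendor_master_key = Fernet.generate_key()  # held by the vendor, not the user
device_data_key = Fernet.generate_key()    # protects this one device's data

# The device stores its data key wrapped under the vendor's master key.
escrow_blob = Fernet(vendor_master_key).encrypt(device_data_key)

# Everyday use: the data key encrypts the device's contents.
data = Fernet(device_data_key).encrypt(b"contacts, texts, photos ...")

# Warrant service (or a breach): whoever holds the master key can
# unwrap the escrow blob of every device that ever created one.
recovered_key = Fernet(vendor_master_key).decrypt(escrow_blob)
assert Fernet(recovered_key).decrypt(data) == b"contacts, texts, photos ..."
```

The difference Vance wants quantified lives in that master key: in the passcode-only design there is no escrow blob to unwrap, and so no single secret whose theft exposes every phone at once.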

Civil liberties

Apple’s legal argument against the FBI boiled down to this: it could not be forced to undo the protections it built into the San Bernardino phone because no law explicitly said it must. The FBI sought to make the company do it under a 1789 law called the All Writs Act, which lets federal courts issue orders to enable the execution of existing laws. Apple countered that this was a path to extreme government overreach, and that it is for Congress, not individual judges, to decide what to require of smartphone makers. Indeed, in February, a judge in a different case found Apple’s argument persuasive and denied federal investigators access to a locked iPhone used by a man who had pleaded guilty to dealing methamphetamine.

So on one thing, at least, Apple agrees with Vance: Congress should act. The question is how. Apple has avoided suggesting what a new law should say, proposing that a “commission or other panel of experts on intelligence, technology, and civil liberties” explore the subject. But Apple CEO Tim Cook seems confident that such a panel would not advocate limits on the implementation of encryption. “There’s too much evidence to suggest that [would be] bad for national security,” Cook told Time magazine in March.

Yet in that same interview, Cook also appeared to concede that the latest smartphones might indeed get in the way of justice. Asked whether he could accept shutting investigators out of phones in the interest of keeping hackers out too, Cook said he could see a way to let the cops in. If investigators “have a problem with you,” he said, perhaps they “can come to you and say, ‘Open your phone.’ … They could pass a law that says you have to do it, or you have to do it or there’s some penalty.”

This kind of “key disclosure” law already exists in several countries, including the United Kingdom. But if Cook is serious when he claims that Apple is “defending the civil liberties of the country” against the U.S. government, his suggestion is about as shortsighted as the Clipper Chip.

For one thing, a law requiring suspects to give up their passcodes would be useless with a locked phone belonging to a murder victim like Devon Godfrey or Brittney Mills. Second, a criminal potentially facing decades in prison would be happy to take a shorter term for contempt of court instead. That probably explains why the U.K. and other countries with key disclosure laws are nonetheless considering laws that would put even more restrictions on encryption than what Vance proposes.

But the most jarring aspect of Cook’s suggestion is that compelling suspects or defendants to reveal their passwords could weaken the protection against self-incrimination embodied in the Fifth Amendment to the U.S. Constitution, written by people who were obsessed with freedom and privacy. It’s not certain that a key disclosure law would be unconstitutional, but multiple U.S. courts have ruled that suspects do not have to reveal passwords because they are “contents of the mind.”

So is Apple ultimately fighting to uphold personal privacy and civil liberties? Or is it fighting for the right to sell any kind of phone it thinks its customers want while other people deal with the negative consequences? If it’s the latter, that’s understandable; like any public company, Apple is obligated to maximize its value to its shareholders. But society is not necessarily best served by letting Apple make whatever phones are optimal for its chosen business strategy, which is to create a shiny mobile vault that people will trust with every aspect of their lives.

“Letting a company lead the debate is a really bad idea,” says Susan Hennessey, who studies national security and governance issues at the Brookings Institution and who has criticized Apple for taking an “anti-communitarian” stance. “A company is not able to take into account the full scope of what our values are.”

It’s very possible that locking cops and prosecutors out of smartphones won’t interfere with justice as much as they fear. They might find ample methods of adapting, especially as new technologies arise. But just because some officials have overreacted to encryption in the past doesn’t mean we should brush off warnings coming now. The justice system is far from infallible, but it is run by people whose duty is to something more than a set of shareholders.
