An iPhone showing the Face ID setup screen with Justice John Roberts's face
AP Photo/J. Scott Applewhite; Apple


Why smartphones’ “cop mode” might not keep cops out for much longer

The debate over “compelled decryption” is likely headed for the US Supreme Court.


I opened the YouTube app last night to watch a video about the anti-government protests in Hong Kong. A clever five-second ad from Apple preceded it. "This is how long it takes for Face ID to unlock your phone," the commercial said. The actor smiled, happily surprised as he unlocked his phone just by looking at it.

The video then switched abruptly to Hong Kong, where local cops have been prying open protesters' shut eyes so that Face ID will unlock their smartphones, giving the cops near-instant access to what can be an entire life's worth of data. Most phones are encrypted and not easily broken into, though companies like Cellebrite have made a profitable business out of hacking into phones at the request of paying government customers. Many of the newest Apple and Android smartphones, however, can respond to their owner's face and reveal the treasures inside.

American cops aren’t likely to resort to forcing someone’s eyes open. But they can command you to look at a phone or to put your finger on the fingerprint sensor to unlock it, an order known as “compelled decryption.”

Whether they should be able to is a live question right now in US civil liberties law. "There are cases pending on this issue right now in the supreme courts of Indiana, Pennsylvania, and New Jersey," says Orin Kerr, a professor at Berkeley Law. "I suspect Supreme Court review will come in the next two to three years. The constitutional standard is unsettled, but I suspect it won't stay unsettled for long."

Back doors, biometrics, and San Bernardino

For years government officials in the US and elsewhere have complained about the “going dark” problem—that as it’s become more common for smartphones and computers to encrypt the data stored on them by default, it’s getting harder for authorities to get hold of that data for crime-solving or anti-terrorist purposes. The debate extends back decades, the 2016 fight between Apple and the FBI over access to the phone of the San Bernardino gunman being a classic recent example.

Government officials have called for companies to build encryption "back doors"—also known by euphemisms like "special access" or "responsible encryption"—into devices like smartphones. Though many details of how these would work remain undefined, they would effectively be keys allowing the government—subject to a court order—to unlock an encrypted device.

Critics, including most cryptographers and information security professionals, say such back doors are inherently insecure. The precise pitfalls depend on the solution, but the general threat is that any back door will inevitably be used by more than just authorized government agents, which could be catastrophic for economic and national security. Attorney General William Barr threw the spotlight back on this debate just last month when he called on Congress to pass a law mandating government back doors to encrypted data (though he offered no new solutions).

Compelled decryption, the argument goes, would be a convenient way for police to get into someone’s device quickly without that back door. At the heart of whether they can legally do so is the US Constitution’s Fifth Amendment, which guarantees the right against self-incrimination.

Before biometrics like FaceID and fingerprint sensors, the only way to unlock a locked phone was with its passcode. Some courts have treated passcodes as “testimony” under the Fifth Amendment, ruling that people cannot be compelled to give them up and potentially incriminate themselves.

Biometrics are treated differently. Just last month, in a federal case against a child pornography suspect, a US district court ruled that forcing him to use his fingerprint to unlock his Google Pixel phone did not violate his Fifth Amendment rights.

Overturning a lower court’s decision, Judge David Nye asserted that there is no testimony in biometrics—indeed, barely any thought at all on the part of the suspect. If “the government agents pick the fingers to be pressed on the Touch ID sensor, there is no need to engage the thought process of the subject at all in effectuating the seizure,” Nye wrote in his decision. “The application of the fingerprint to the sensor is simply the seizure of a physical characteristic, and the fingerprint by itself does not communicate anything. It is less intrusive than a forced blood draw. Both can be done while the individual sleeps or is unconscious.”

Mana Azarmi, a lawyer with the Center for Democracy and Technology, points to the paradoxical situation this creates. “Right now, your data is more secure if you use one form of protection instead of another,” she says. “That requires the user to keep abreast of these issues. The common person doesn’t even always use passwords, but if you ask that common person on the street how courts view passwords versus face or fingerprints, they’d be shocked to hear the difference.”

Which part of the Constitution governs compelled decryption?

Kerr wrote this year in the Texas Law Review that the fight over compelled decryption is focused on the wrong part of the Bill of Rights. He argued that forced unlocking of a phone should be governed by the Fourth Amendment, the protection against unreasonable searches and seizures. If law enforcement passes that test (e.g., by getting a search warrant), then the Fifth Amendment shouldn’t block compelled decryption, whether it’s using a passcode, a fingerprint, face recognition, or any other method. The effect, he wrote, would be to undercut “more draconian” proposals like back doors, by giving police a way in without weakening encryption standards.

Despite the close eye of lawyers and civil liberties activists, even some of Capitol Hill’s key players on privacy and encryption issues have yet to come down on one side or another.

“Senator Warner believes that this is an issue that requires a very nuanced discussion, recognizing the enormous value that encryption has for our national security, and the need to equip law enforcement with tools that allow them to use technology to their benefit, rather than seeing technology as an obstacle,” Rachel Cohen, a spokeswoman for the Virginia Democrat Mark Warner, said. Warner is one of the US Senate’s most active legislators on technology policy.

If compelled decryption is ruled to be legal, there is a fallback for users who want to protect their data, one that has already become popular with protesters in Hong Kong. Dubbed "cop mode" by fans of the feature, it involves—on an iPhone—pressing and holding the power button on the right and a volume button on the left for five seconds, which disables biometric unlocking and requires a passcode instead. Google's Android has a similar feature in its settings. With a few quick button presses, biometrics are turned off.

Passwords can be forced out of you too, of course, but it's not quite as easy. A password is a secret inside your brain. Face ID is who you are.