
Technology Can Make Lawful Surveillance Both Open and Effective

With cryptography, surveillance processes could be open and preserve privacy without undermining their investigative power.

Democracy rests on the principle that legal processes must be open and public. Laws are created through open deliberation by elected bodies; they are open for anyone to read or challenge; and in enforcing them the government must get a warrant before searching a person’s private property. For our increasingly electronic society to remain democratic, this principle of open process must follow us into cyberspace. Unfortunately it appears to have been lost in translation.

The NSA, secretly formed after World War II to spy on wartime adversaries, has clung to military-grade secrecy while turning its signals-intelligence weapons on ourselves and our allies. While nominally still a “foreign-intelligence” agency, the NSA has become a de facto law-enforcement agency by collecting bulk surveillance data within the U.S. and feeding these data to law-enforcement agencies. What walks like a duck and squawks like a duck is usually a duck, and since the NSA has been squawking like a law-enforcement agency, it should be subject to open processes like a law-enforcement agency.

Other agencies have also caught secret surveillance fever. Arguing that phone or Internet users have no expectation of privacy, the FBI secretly uses warrantless subpoenas to obtain bulk cell-tower records affecting hundreds of thousands of users at once, whether investigating bank robberies or harmless urban pranks. Police spy on entire neighborhoods with fake cellular base stations known as “StingRays” and have deliberately obfuscated warrants to conceal their use of the technology.

All this secrecy—and its recent partial unraveling—has harmed our democracy and our economy. But effective surveillance does not require total secrecy. With a policy and technology framework that our team and others have developed, surveillance processes could be made open and privacy-preserving without compromising their effectiveness. Details will be presented today in our paper “Catching Bandits and Only Bandits” at the Workshop on Free and Open Communications on the Internet.

We propose an openness principle—something we believe is necessary to constrain electronic surveillance in a healthy democracy. In brief, any surveillance process that collects or handles bulk data or metadata about users not specifically targeted by a warrant must be subject to public review and should use strong encryption to safeguard the privacy of innocent users. Only after law-enforcement agencies identify people whose actions justify closer investigation and demonstrate probable cause via an authorized electronic warrant can they gain access to unencrypted surveillance data or employ secret analysis processes. The details of an investigation need not be public, but the data collection process would be—what information was collected, from whom, and how it was encrypted, stored, searched, and decrypted. This is no different in principle from the way the police traditionally use an open process to obtain physical search warrants without publicly revealing the target or details of their investigation.

Technology we have developed could allow law-enforcement agencies to implement this approach without hampering their work. In fact it could even enhance it. As we have argued before and have now demonstrated, modern cryptography could enable agencies to find and surgically extract warrant-authorized data about persons of interest like needles in a haystack of encrypted data, while guarding both the secrecy of the investigation and the privacy of the innocent users whose data make up the haystack. The NSA was aware of this option but, shielded from public scrutiny, chose a more invasive path. Our design ensures that no sensitive data can be decrypted without the use of multiple keys held by independent authorities, such as the law-enforcement agency, the authorizing judge, and a legislative oversight body.
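The multiple-key safeguard can be illustrated with a toy sketch (this is an illustration of the general idea, not the authors' actual construction): each record is encrypted under a fresh data key, and that key is XOR-split into shares held by the agency, the judge, and an oversight body, so decryption is possible only when all three cooperate. The hash-based stream cipher below is a stand-in chosen to keep the example self-contained; a real system would use a vetted cipher and a proper threshold scheme.

```python
import hashlib
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """XOR secret sharing: all n shares are required to reconstruct the key."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: keystream from iterated SHA-256 (illustration only)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Encrypt a surveillance record under a fresh key, then split that key
# among three independent authorities (hypothetical roles).
record = b"cell-tower log entry"
data_key = secrets.token_bytes(32)
ciphertext = xor_stream(data_key, record)
agency_share, judge_share, oversight_share = split_key(data_key, 3)

# Decryption succeeds only when all three shares are combined;
# any two shares alone reveal nothing about the data key.
recovered = xor_stream(
    combine_shares([agency_share, judge_share, oversight_share]), ciphertext
)
assert recovered == record
```

Because each share is uniformly random on its own, any coalition short of all three authorities learns nothing about the data key, which is the property the oversight design relies on.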

Our approach can target not just known but unknown users. In the case of bank robbers known as the High Country Bandits, the FBI intercepted cell-tower records of 150,000 people to find one criminal who had carried a cell phone to three robbery sites. Using our encrypted metadata search system, the FBI could have quickly extracted the bandit’s number without obtaining data on about 149,999 innocent bystanders. The same system could discover unknown associates of known targets. This and many other cryptographic methods could facilitate the legitimate pursuit of criminals and terrorists while protecting our privacy.
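The High Country Bandits search pattern can be sketched as an intersection over pseudonymized records (a simplified illustration under stated assumptions, not the authors' encrypted-search system): each site's cell-tower log is blinded with a keyed hash whose salt is held by the authorizing parties, and only identifiers present at all three robbery sites become candidates for unmasking under the warrant. The salt, phone numbers, and site data below are all hypothetical.

```python
import hashlib

# Hypothetical per-investigation secret held by the authorizing parties.
SALT = b"per-investigation secret"

def blind(number: str) -> str:
    """Pseudonymize a phone number with a keyed hash so raw numbers stay hidden."""
    return hashlib.sha256(SALT + number.encode()).hexdigest()

# Blinded cell-tower records from each robbery site (toy data).
site_a = {blind(n) for n in ["555-0001", "555-0002", "555-0099"]}
site_b = {blind(n) for n in ["555-0099", "555-0003"]}
site_c = {blind(n) for n in ["555-0004", "555-0099"]}

# Only pseudonyms appearing at all three sites are candidates for unmasking
# under the warrant; every other number is never revealed to investigators.
candidates = site_a & site_b & site_c
assert candidates == {blind("555-0099")}
```

In the real case the three sets would contain tens of thousands of entries each, but the intersection, and hence the data ever decrypted, would still be a handful of numbers rather than 150,000.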

Secrecy-obsessed agencies will fret that open processes like those we propose might help terrorists evade surveillance. But it’s better to risk a few criminals being slightly better informed than to risk the privacy and trust of everyone. When intelligence leaders lie to Congress and spy on their overseers, we must ask whether the existential threat to our society is hiding in rocky caves or in Beltway offices. With the right technology, we can have both strong national security and strong privacy.

Bryan Ford is an associate professor of computer science at Yale University, where he leads the Decentralized/Distributed Systems research group.

Joan Feigenbaum is Grace Murray Hopper professor and chair of the computer science department at Yale University.
