Study Shows Flawed U.S. Encryption Standard Could Be Broken in Seconds
The security of a data connection protected with a flawed U.S. encryption standard promoted by the National Security Agency could be broken in under 16 seconds using a single computer processor. That’s according to the first in-depth study of how easily encryption systems that use the now-deprecated Dual_EC random number generator could be defeated by an attacker who had “backdoored” the standard.
The flawed standard has never been widely used to protect Internet communications, even though the security company RSA got $10 million from the NSA to make it the default random number generator in one of its software packages. It is not known whether the NSA or anyone else knows the crucial mathematical relationship needed to exploit the flaw and undo encryption based on Dual_EC.
However, the study conclusively shows that an attacker who knew the key to the Dual_EC backdoor could put it to practical use. Not all six of the encryption software packages tested could be defeated in seconds: half took a 16-processor cluster between 60 and 80 minutes of work to break. But a national intelligence agency could significantly improve on those times by devoting more computing power to the problem.
Documents leaked by Edward Snowden and published in September 2013 indicate that the NSA has tried to influence encryption standards and to encourage commercial companies to make security products more susceptible to U.S. surveillance. Both the National Institute of Standards and Technology (NIST) and RSA withdrew their endorsements of Dual_EC after the Snowden documents were published last year.
The new study was carried out by researchers from Johns Hopkins University, the University of Wisconsin, the Technical University of Eindhoven, the University of Illinois at Chicago, and the University of California, San Diego.
NIST first proposed Dual_EC in 2006. Months later, two researchers from Microsoft found a mathematical flaw resembling an intentional “backdoor,” one that could be used to undo encryption based on the standard.
The weakness centers on two constants, known as P and Q, that function as a kind of default setting for the generator and are supposed to be randomly chosen and unrelated to one another. However, if there is a hidden mathematical relationship between the two, anyone who knows it can predict the generator’s future output after seeing just one of its past outputs.
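To see how such a trapdoor works in principle, here is a minimal Python sketch. It is not the researchers’ code: the tiny curve, the constants, the trapdoor value, and the seed are all made up for illustration, and the real standard’s output truncation is skipped to keep the example short. It shows that once P is secretly derived from Q, anyone holding the secret multiplier can turn a single observed output back into the generator’s internal state and predict everything that follows.

```python
# Minimal sketch of the Dual_EC-style trapdoor: if P = e*Q for a secret e,
# one output is enough to recover the generator's state and predict the rest.
# All parameters below are hypothetical toy values, not the NIST constants.

p = 2**61 - 1                 # toy prime field (p % 4 == 3, so square roots are easy)
a, b = 2, 3                   # toy curve: y^2 = x^3 + a*x + b  (mod p)
INF = None                    # point at infinity

def inv(x):
    return pow(x, p - 2, p)

def add(P1, P2):
    """Add two points on the toy curve."""
    if P1 is INF: return P2
    if P2 is INF: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return INF
    if P1 == P2:
        m = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        m = (y2 - y1) * inv(x2 - x1) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, pt):
    """Scalar multiplication by double-and-add."""
    acc = INF
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

def lift_x(x):
    """Return a curve point with the given x-coordinate (either root of y works)."""
    rhs = (x * x * x + a * x + b) % p
    y = pow(rhs, (p + 1) // 4, p)          # valid square root because p % 4 == 3
    assert y * y % p == rhs, "x is not on the curve"
    return (x, y)

# An honest standard would pick P and Q independently. Here the "standards
# author" secretly derives P from Q using a trapdoor multiplier e.
Q = next(lift_x(x) for x in range(1, 100)
         if pow((x**3 + a*x + b) % p, (p - 1) // 2, p) == 1)
e = 1337                      # the trapdoor: P = e*Q
P = mul(e, Q)

def dual_ec_block(s):
    """One (untruncated) Dual_EC-style block: update the state, emit x(s*Q)."""
    s = mul(s, P)[0]          # s_i = x(s_{i-1} * P)
    r = mul(s, Q)[0]          # output r_i = x(s_i * Q)
    return s, r

# The victim seeds the generator and produces two outputs.
state, out1 = dual_ec_block(424242)
state, out2 = dual_ec_block(state)

# The attacker sees only out1 (plus the trapdoor e) and predicts out2.
R = lift_x(out1)              # R = +/- s*Q; the sign does not affect x-coordinates
next_state = mul(e, R)[0]     # x(e * s * Q) = x(s * P) = the generator's next state
predicted = mul(next_state, Q)[0]
assert predicted == out2
print("predicted:", predicted, "actual:", out2)
```

In the real standard each output is truncated by 16 bits, so an attacker must also try on the order of 65,000 candidate points per observed output, which is part of why the attacks measured in the study take seconds to hours rather than being instantaneous.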
Some security experts have long suspected that the values of P and Q in NIST’s version of Dual_EC are linked in some way, and that the NSA knows exactly how, allowing it to undo encryption based on the standard. Those fears gained credence when the Snowden documents showed that the agency had a policy of trying to influence new standards.
To test what a key to the backdoor in Dual_EC might allow, the researchers chose values of P and Q that were mathematically linked. They then played the role of an attacker trying to break encrypted TLS connections made by software in use today that supports Dual_EC or once used it by default. TLS connections are widely used to secure Internet data, such as Web browsing, e-mail, and VoIP.
RSA’s two implementations of Dual_EC, both of which formerly used it as the default random number generator, proved to be the easiest to break. A version written in the C programming language could be undone in under 16 seconds using a single computer processor, and in under three seconds using a computing cluster with 16 processors. A version of RSA’s software written in Java took the cluster around an hour, about the same as one version of Microsoft’s Schannel security software.
That variation in susceptibility was mostly caused by seemingly minor implementation choices made by different software developers. However, the Java version of RSA’s software could be further weakened by enabling an NSA-backed extension, Extended Random, bundled with the software. Turning on that feature sped up the work of backdooring TLS connections by 65,000 times. Extended Random was proposed as a standard to the Internet Engineering Task Force by the NSA and others in 2008, after which RSA added it to some of its software. However, few other companies did, and it was dropped from the standardization process.