Much of today’s data security rests on encryption techniques involving mathematical functions that are easy to perform but hard to reverse, such as multiplying two large prime numbers. Factoring the result without knowing either of the prime numbers is very difficult, at least for the computers we use today; it could take thousands of years for even the most powerful supercomputer to identify the original numbers and crack an encrypted message.
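The asymmetry is easy to see in a toy example. Below is a minimal sketch: trial division stands in for far more sophisticated factoring algorithms, and the primes here are a few digits long, whereas real keys use primes hundreds of digits long.

```python
# Multiplying two primes is a single cheap operation; undoing it
# by trial division costs roughly as many steps as the smaller prime.
def multiply(p: int, q: int) -> int:
    return p * q  # fast even for enormous numbers

def factor(n: int) -> tuple[int, int]:
    """Brute-force trial division: fine for toy numbers,
    hopeless for the 600-digit products used in real keys."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

n = multiply(104_723, 104_729)  # two six-digit primes: instant
print(factor(n))  # (104723, 104729), after ~100,000 divisions
```

Scale the primes up to the sizes used in practice and the multiplication stays instant while the division loop outlives the universe; that gap is the whole security argument.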
The task would be a snap for a quantum computer, however. (In these computers, the basic unit of information, the qubit, need not be just a 0 or a 1; it can exist in a superposition, a weighted blend of both values at once.) A practical quantum computer is probably at least decades away, but simple demonstrations have already been made. Some researchers warn that such a computer would herald the death of today's encryption and expose not just new information but also preëxisting material encrypted in the expectation that it would stay secret indefinitely. Imagine if it became possible in 20 years to read electronic medical records being created today.
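What a "blend of both values" means can be sketched with ordinary arithmetic. This is a minimal illustration, not a simulation of real hardware: a qubit's state is reduced to a pair of real amplitudes, and a Hadamard gate turns a definite 0 into an equal superposition.

```python
import math

# A qubit state here is two amplitudes (a, b): the probability of
# reading 0 is a**2 and of reading 1 is b**2 (real amplitudes only,
# for simplicity).
def hadamard(state):
    """Apply a Hadamard gate, which mixes the 0 and 1 amplitudes."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)            # a definite, classical 0
superposed = hadamard(zero)  # an equal blend of 0 and 1
probs = tuple(round(x * x, 3) for x in superposed)
print(probs)  # (0.5, 0.5): a 50/50 mix, not a definite bit
```

It is this ability to hold and process many classical values at once that lets quantum algorithms attack factoring far faster than any known classical method.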
Alternative systems are being proposed. Lattice-based encryption, for example, relies on the difficulty of problems such as finding the shortest nonzero vector in a high-dimensional lattice; that problem is believed to be so hard it would stump even a quantum computer. Even though such a computer may not be available for many years, if ever, it’s important to begin developing and deploying better cryptography now so it has time to diffuse throughout cyberspace.
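The shortest-vector problem can be shown in a deliberately tiny case. The sketch below brute-forces a 2-dimensional lattice, where the problem is easy; real lattice schemes work in hundreds of dimensions, where no known classical or quantum algorithm solves it efficiently.

```python
import math
from itertools import product

# A lattice is every integer combination of some basis vectors.
def shortest_vector(basis, search=10):
    """Brute-force the shortest nonzero lattice vector by trying
    all small integer combinations of the two basis vectors."""
    best, best_len = None, math.inf
    for c1, c2 in product(range(-search, search + 1), repeat=2):
        if c1 == 0 and c2 == 0:
            continue  # skip the zero vector
        v = (c1 * basis[0][0] + c2 * basis[1][0],
             c1 * basis[0][1] + c2 * basis[1][1])
        length = math.hypot(*v)
        if length < best_len:
            best, best_len = v, length
    return best

# A long, skewed basis that secretly generates the plain integer grid:
print(shortest_vector([(7, 8), (8, 9)]))  # (1, 0): length 1, far
# shorter than either basis vector
```

The brute-force search above examines every combination, which is exactly what becomes infeasible as the dimension grows: the search space explodes exponentially, and that explosion is what the proposed schemes lean on.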