A New Kind of Bit

Actually, when quantum computing started out, it was a fanciful theorist’s dream, most notably that of the late Richard Feynman. The famously whimsical Nobel laureate began pushing the idea in 1981, following an earlier suggestion by Argonne National Laboratory physicist Paul Benioff. Others quickly followed Feynman’s lead. And by the time Ike Chuang first encountered quantum computing in the late 1980s, shortly after receiving his undergraduate physics degree from MIT, at least half a dozen physicists and computer scientists were actively working in the field.

Certainly Chuang was smitten. “I’ve always wanted to know what information is in a physical sense, and to understand what physics is in terms of information,” he says. Quantum computing seemed like a completely new way to understand them both.

Take this business of encoding information, he says: The subatomic world is full of yes-no choices that make it downright easy. Most particles, including electrons, protons, and even the ephemeral packets of light called photons, possess a kind of built-in rotary motion known as “spin.” So if your subatomic computer used electrons, for example, you could say that an electron spinning in one direction represented a binary 1, while an electron spinning in the opposite direction represented a 0. And once you encode the information, says Chuang, the subatomic world also offers any number of ways to process it. By manipulating the magnetic environment of electrons, say, or by routing photons through an array of polarizers, mirrors, and beam splitters, you could subject your quantum bits to all the operations required by a digital computer.
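
To make the encoding concrete, here is a minimal Python sketch; it is an illustration layered on the article, not anything Chuang specifies. The two spin directions become the two basis vectors of a qubit, and the bit flip uses the standard Pauli-X matrix from quantum mechanics.

```python
import numpy as np

# Encode the two spin states as basis vectors: one spin direction is 0, the other 1.
ZERO = np.array([1, 0], dtype=complex)  # |0>, e.g. spin "up"
ONE = np.array([0, 1], dtype=complex)   # |1>, e.g. spin "down"

# A bit flip (logical NOT) is the Pauli-X matrix; physically, this is the kind of
# operation realized by manipulating a particle's magnetic environment.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)

print(X @ ZERO)  # -> [0, 1], i.e. |1>: the encoded 0 has been flipped to 1
```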

But there would also be a critical difference. Conventional computers follow the rules of binary logic, governed by an ironclad either-or distinction: Each bit of information is either true or false, on or off, one or zero. To enforce this distinction, conventional machines represent each bit as the presence or absence of a few zillion electrons collected in a tiny silicon transistor, so that the zillions are either there or they are not.

But once you get down to the level of individual particles, says Chuang, almost nothing is absolute. An electron, for example, could be spinning one way or the other, but it could also exist as a kind of mixture of spins. According to the laws of quantum physics, you could say the electron has a probability of spinning one way or the other. But unless you actually made a measurement and forced the issue, you couldn’t know which it was; in a sense, the electron itself would be undecided. And that, in turn, means that each bit of quantum information could be undecided. Instead of being either-or, a quantum “qubit” could be both-and: representing 0 and 1 simultaneously.
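
A short sketch of that undecidedness, again in Python with NumPy (the equal-weights superposition here is our choice, for illustration): the qubit carries an amplitude for both 0 and 1, and a simulated measurement “forces the issue,” yielding a definite outcome at random with the corresponding probability.

```python
import numpy as np

# A qubit in an equal superposition of |0> and |1>: amplitude 1/sqrt(2) for each.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Squared amplitude magnitudes give the probabilities of measuring 0 or 1.
probs = np.abs(qubit) ** 2  # -> [0.5, 0.5]

# Measuring collapses the undecided qubit to a definite bit.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probs)
print(probs, "measured:", outcome)
```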

This ambiguity has a powerful consequence that becomes more apparent when you think of not one but two qubits. These qubits could simultaneously exist as a combination of all possible two-bit numbers: (00), (01), (10) and (11). Add a third qubit and you could have a combination of all possible three-bit numbers: (000), (001), (010), (011), (100), (101), (110) and (111). This kind of system scales exponentially: n qubits can stand for 2ⁿ numbers at once. Line up a mere 40 qubits, and you could represent every binary number from zero to more than a trillion, simultaneously.
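
The arithmetic behind that claim takes only a few lines of Python to verify; the loop below simply tabulates 2ⁿ, the number of n-bit values a register of n qubits can hold at once.

```python
# Each added qubit doubles the number of representable values: n qubits -> 2**n.
for n in (1, 2, 3, 10, 40):
    print(f"{n:2d} qubits -> {2 ** n:,} simultaneous values")
# 40 qubits -> 1,099,511,627,776 values: more than a trillion.
```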

Furthermore, says Chuang, just as a collection of qubits could represent a huge array of numbers simultaneously, a quantum computer could process every possible input simultaneously: the most perfect form of parallel processing imaginable. Given the right kind of problem and a sufficient supply of qubits, a quantum computer could outpace its conventional counterparts by many orders of magnitude. “I read about all this,” says Chuang, remembering his first encounter with Feynman’s articles around 1987 or so, “and from then on, I wanted to build a quantum computer.”
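
A toy simulation hints at that parallelism (the register size and the predicate f below are our hypothetical choices, not anything from the article): a 3-qubit register in uniform superposition holds all eight 3-bit inputs at once, and a single matrix application evaluates a marking “oracle” on every one of them simultaneously. Extracting a useful answer from the result is the hard part, which is why it takes the right kind of problem.

```python
import numpy as np

n = 3
dim = 2 ** n  # 8 basis states: every 3-bit input, 000 through 111

# Uniform superposition: equal amplitude on all eight inputs.
state = np.ones(dim, dtype=complex) / np.sqrt(dim)

def f(x):
    return x == 5  # hypothetical predicate, purely for illustration

# A phase "oracle": flip the sign of the amplitude wherever f(x) is true.
oracle = np.diag([(-1.0 if f(x) else 1.0) for x in range(dim)])

# One application of the oracle evaluates f on all 2**n inputs at once.
state = oracle @ state
print(state.round(3))
```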
