Harnessing Quantum Bits

Computers that can simultaneously process information in numerous alternate realities are less theoretical than you might think.

The achievement may not rank up there with Samuel Morse’s transmitting “What hath God wrought” from Washington, DC, to Baltimore in 1844 or Alexander Graham Bell’s voice intoning, “Watson, come here. I want you” from one room to another in 1876. Nevertheless, scientists may eventually mark as a milestone the day in 2001 when Isaac Chuang and his colleagues at IBM determined that the two prime factors of the number 15 are three and five.

What made their calculation remarkable, of course, wasn’t the grammar school arithmetic, but that the calculation had been performed by seven atomic nuclei in a custom-designed fluorocarbon molecule. The irony that an experiment so complex and delicate would yield a result so pedestrian and mundane is not lost on Chuang, one of the world’s most prominent researchers in quantum computing. “My group,” he says with a chuckle, “holds the world record for the largest and most useless quantum computer.”

But Chuang, now an associate professor at the MIT Media Lab, might be showing an excess of humility. Quantum computers exist today only on a painfully small scale. But despite a slow start, the field seems to be on the verge of yielding real advances in quantum theory and engineering. Researchers have proposed the first designs for large-scale quantum computers, devices that use the bizarre properties of subatomic particles existing at the extremes of small size and high speed to solve problems that confound even the most powerful conventional computing devices.

One engineering approach with considerable promise employs a class of devices that can trap individual electrons within an electromagnetic field. Their “spin,” or orientation in a magnetic field, can be observed to produce a quantum bit, or “qubit.” Another promising approach uses nuclear magnetic resonance, which can manipulate collections, or “ensembles,” of molecules to perform computations and return results in a measurable form. This is the technique Chuang and Neil Gershenfeld, a fellow MIT Media Lab professor, are exploring. “People are coming up with all these tools that will make a quantum computer easier to make,” says Jonathan Dowling, principal scientist and supervisor of the quantum-computing technologies group at NASA’s Jet Propulsion Laboratory in Pasadena, CA.

This research is also starting to indicate near-term spinoffs, including improvements in electronic controls for navigational, communications, and measurement devices. “One of the things I do is think of ways to use quantum computing to make better gizmos,” says Dowling. Among his projects is a quantum gyroscope that would exploit the quantum behavior of photons to make these crucial navigation devices more sensitive. The excitement has spread even to the artificial-intelligence community: there are signs that the ability of quantum algorithms to probe multiple possibilities simultaneously might help in the mining of large-scale databases, one of the field’s most important practical goals. If the global positioning systems, mobile phones, search engines, and integrated circuits of the future are vastly more precise or reliable than today’s, it may be the result of trailblazing quantum-computing efforts under way right now in labs around the world.

The electronics industry, naturally, has taken notice. IBM sponsors quantum-computing research at its Almaden Research Center on the outskirts of Silicon Valley, where Chuang performed his initial work, as well as at its flagship Thomas J. Watson Research Center in Yorktown Heights, NY. Hewlett-Packard supports quantum-computing research at its labs in Palo Alto, CA, and Bristol, England. And the semiconductor industry, which has come of age along with advances in the electronics of classical computing, is keeping a sharp eye on developments in the field, the better to be prepared for the post-Moore’s Law era, when miniaturization of classical electronic circuits bumps against physical limits. Experts predict that will happen sometime within the next two decades.

“Progress has been slow, but it’s been steady,” says David P. DiVincenzo of IBM’s Watson lab. “Two or three years ago some significant efforts were started that are beginning to pay off.”

Multiple Realities

Whatever the technological approach, the goal of quantum computing is to take advantage of the very quality of quantum mechanics that designers of classical computers try mightily to avoid: its pure weirdness.

The logic circuits of traditional computers, for example, work by detecting discrete differences in the voltage passing through electronic gates: a high voltage indicates a binary one and a low voltage indicates a zero. Binary information can be represented also by the spin of an electron or the polarity of a photon. But because these particles exist in the quantum realm, which is marked by infinitesimally short distances, nearly speed-of-light velocities, and extremely low energy levels, their states can’t always be discerned as strictly one or zero. Electrons and photons can behave like waves rather than particles, appear to occupy more than one place at a time, and simultaneously exhibit incompatible states: an electron’s spin, for instance, can be both “up” and “down,” a condition that quantum mechanics calls superposition. In computing terms, the qubit can be one and zero simultaneously.
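
In the textbook notation of quantum mechanics (generic notation, not tied to any particular lab’s hardware), that condition is written as a weighted sum of the two classical values:

```latex
% A single qubit in superposition: a weighted combination of the classical
% states |0> and |1>. The amplitudes alpha and beta set the odds of each
% readout and must satisfy the normalization condition below.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^{2} + |\beta|^{2} = 1
\]
```

Only a measurement forces a choice: the readout is 0 with probability |α|² and 1 with probability |β|².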

The second phenomenon important to quantum computing is “entanglement,” in which two or more particles are created with mutually dependent properties: when, for example, a single photon is turned into two complementary particles by an optical beam splitter. Measuring the properties of one particle instantaneously determines the state of the other, even if at the moment of measurement the two are separated from one another by galactic distances. (Einstein famously derided this phenomenon as “spooky action at a distance.”)
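
A standard textbook illustration of such a pair (again, generic notation rather than a description of any specific experiment) is the two-qubit “Bell state,” in which neither particle has a definite value on its own, yet the two readouts always agree:

```latex
% A maximally entangled pair. Each qubit alone reads 0 or 1 with equal
% probability, but the two outcomes are perfectly correlated: measuring
% one instantly fixes the other, however far apart the particles are.
\[
  |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
\]
```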

It has long been believed that these phenomena could be employed to solve computational problems beyond the reach of traditional technology. A single quantum bit that can be in two states at once can do the job of two classical bits operating in parallel. (Only when the particle is measured or observed do all the possibilities resolve into a single, classical reality.) Two entangled qubits, meanwhile, can simultaneously evaluate four inputs. Put another way, a traditional memory register with eight bits can store only one of a possible 2⁸, or 256, digital “words,” but a quantum register with eight qubits can represent and compute with all 256 words at once.
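
The bookkeeping behind that claim can be checked with a few lines of ordinary linear algebra. The sketch below (a toy NumPy illustration, not a model of any real quantum hardware) simply counts the amplitudes an 8-qubit register carries and builds the uniform superposition that weights every 8-bit word at once:

```python
import numpy as np

n = 8                                  # qubits in the register
dim = 2 ** n                           # 2^8 = 256 possible 8-bit "words"

# A classical 8-bit register holds exactly one of those words. A quantum
# register is described by an amplitude for every word; a uniform
# superposition weights all 256 of them at once.
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

print(dim)                                        # 256
print(np.isclose(np.sum(np.abs(state) ** 2), 1))  # True: probabilities sum to 1
```

Storing all 2ⁿ amplitudes explicitly is exactly what a classical simulation must do, and what a real quantum register would not; that is where both the promised advantage and the difficulty of simulating such machines come from.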

Such general notions were articulated by Richard Feynman, among others, as long ago as the 1980s. For years, however, no one had a clear idea how they might be applied to real problems. In 1994 mathematician Peter W. Shor of AT&T Bell Labs described an algorithm, or program, that proved a quantum computer could factor large numbers exponentially faster than any known conventional method.

Shor’s discovery was important because factoring is exactly the kind of problem that overwhelms conventional computers: As a number grows larger, the resources required to factor it expand rapidly. To factor the number six is trivial, but experts estimate it would take all the supercomputers in the world longer than the age of the known universe to find the factors of a number with 300 digits.
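
The brute-force version of the problem makes the scaling easy to see. A minimal trial-division sketch (illustrative only; serious attacks on large numbers use cleverer but still super-polynomial methods) must test on the order of √N candidate divisors, a count that roughly multiplies by ten with every two digits added to N:

```python
import math

def trial_division(n: int) -> int:
    """Return the smallest prime factor of n by brute force."""
    # Worst case, this loop runs about sqrt(n) times: manageable for 15,
    # hopeless for a 300-digit number (whose square root has ~150 digits).
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n  # n itself is prime

print(trial_division(15))           # 3, the "world record" factorization
print(trial_division(2**31 - 1))    # 2147483647 is prime, so the loop grinds
                                    # through all ~46,000 candidates first
```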

Factoring, moreover, is a mathematical problem with real-world applications. The very intractability of large-number factoring is at the core of modern cryptography, which relies on it to create keys that cannot be cracked in any practical amount of time. Shor’s algorithm represented both a dagger in the heart of old-style “unbreakable” codes and a signpost pointing toward a new class of truly unbreakable codes. Indeed, many quantum-computing experts predict that quantum cryptography will be the first commercial application to emerge from the fledgling science, and already at least three companies have been formed to market secure communications systems based on quantum key distribution (see “Quantum Cryptography,” TR February 2003).

Beyond Secure Codes

At the same time, Shor’s work provoked a surge of interest in other possible applications of quantum computing. “After the first result from Shor,” says computer scientist Wim van Dam of HP Labs, “everybody was very optimistic that we’d find lots and lots of algorithms” for which quantum computing would be useful. But through most of the 1990s, such “killer apps” remained elusive. For a time, scientists feared that the factoring problem would be quantum computing’s only payoff.

In the last two years, however, an improved understanding of how qubits work has stimulated a new search for problems qubits are particularly well suited to solve. Edward Farhi of the Center for Theoretical Physics at MIT is among those who have undertaken that work. “The whole focus of our interest,” Farhi says, “is if you had a perfectly functioning quantum computer, what would you do with it?”

Farhi and his colleagues have identified a range of general, if abstract, computations quantum computers can complete far faster than classical computers. One is a navigation problem in which a traveler with neither map nor guide moves from starting point to destination via random paths that branch from a given number of way stations. Surprising nobody, Farhi’s team showed that under such conditions, the time a classical computer requires to find a way from point A to point B expands exponentially as the number of branching points between A and B increases. In contrast, a quantum computer would travel all possible paths at once, reliably finding its way across the maze in a time frame that expands only arithmetically with the complexity of the maze.

“I’m kind of pleased with this thing even though it’s a bit artificial,” Farhi says. What distinguishes Farhi’s problem from the famous “traveling salesman” problem and other logistical conundrums is that the traveler has no map and therefore only limited knowledge of the pathway. But in the real world, a traveler is likely to have a map showing all possible paths, and the challenge is to find the most efficient route. All a quantum computer does is traverse all paths at once to quickly arrive at the destination; it won’t produce a particular route. So it isn’t yet clear whether Farhi’s work points to a practical application. “This is a steppingstone to a better example,” Farhi says hopefully. “I guess you could say we’re looking for a real-world example.”

A more likely use of quantum computing may be in database searches. In 1996 Lov Grover, a physical-sciences researcher at Lucent Technologies’ Bell Labs, developed an algorithm that showed how quantum computing could vastly accelerate searches: A classical computer searching a phone book with, say, one million entries would average some five hundred thousand tries to find a single specified phone number. A quantum computer would need only one thousand tries.
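
The arithmetic behind those figures is straightforward; the exact query count depends on the variant of Grover’s algorithm, so the numbers below are order-of-magnitude estimates rather than precise operation counts:

```python
import math

N = 1_000_000                        # phone-book entries
classical_avg = N // 2               # ~500,000 lookups for an average classical search
grover_scale = round(math.sqrt(N))   # ~1,000 queries for a quantum search, up to a small constant

print(classical_avg, grover_scale)   # 500000 1000
```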

“The point is,” Farhi says, “quantum speedup is not universal. So finding problems for which it’s advantageous is an art.”

Qubit Trappers

While Farhi and his colleagues are determining what can be done with a quantum computer, others are hard at work engineering the hardware itself.

IBM’s DiVincenzo says a practical quantum computer must have five fundamental capabilities. It must provide for qubits: particles or groups of particles that can be isolated and placed in superposition, the indeterminate state in which they represent both ones and zeroes. It must be possible for operators to control the initial states of the qubits, analogous to setting them all to zero at the outset of a computation. The qubits must remain stable in superposition long enough to perform an operation, anywhere from milliseconds to several seconds. It must be possible to implement quantum logic circuits that correspond to such Boolean operators as AND, OR, and NOT, which form the basis of traditional computer architecture. In classical computers these expressions are embodied in electrical circuits. The simplest logic gate, the NOT gate, converts an incoming digital one into a zero, and vice versa. To manipulate qubits, quantum circuits will have to employ techniques such as extremely precise control of magnetic fields or laser pulses (a toy simulation of a quantum NOT gate appears after the final requirement below).

The final requirement of a quantum computer is that it make the results of a calculation accessible to the user, through, for example, a visual readout.
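
To make the logic-gate requirement concrete, here is a toy NumPy simulation of a quantum NOT gate (the Pauli-X matrix) acting on a single simulated qubit. It is plain linear algebra standing in for the magnetic-field or laser-pulse control described above, not a model of any particular device:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)              # the basis state |0>
X = np.array([[0, 1],
              [1, 0]], dtype=complex)               # quantum NOT (Pauli-X) gate
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2) # Hadamard gate: creates superpositions

print(X @ zero)         # [0, 1]: NOT flips |0> to |1>, just like its classical cousin
superposed = H @ zero   # an equal superposition of |0> and |1>
print(X @ superposed)   # the gate acts on both components of the superposition at once
```

Chaining such matrices together is how a simulator represents a quantum circuit; on real hardware, each matrix corresponds to a precisely timed pulse applied to the qubit.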

Most quantum-computing experiments boil down to efforts that address one or more of DiVincenzo’s requirements. “There are probably a half-dozen serious proposals and 10 times that number that are not serious,” says Bruce Kane, who specializes in the science of single-electron devices at the University of Maryland.

Chuang and Gershenfeld, for example, used nuclear magnetic resonance to measure the spin of qubits in bulk materials: a vial containing a billion billion molecules custom-made from fluorine, carbon, iron, hydrogen, and oxygen. The spins of the nuclei of the five fluorine and two carbon atoms in each molecule functioned as interacting qubits to execute Shor’s algorithm. Although Chuang and Gershenfeld’s achievement in controlling and measuring the spins of seven qubits has been widely hailed, many in the field believe that scaleup of this approach will be extremely difficult. “The limitation is that every time you add qubits, the signal-to-noise ratio decreases,” says Kane, referring to the amount of useful information, such as the excess of particles with one spin over particles with a different spin, that can be distinguished from random disturbances in the bulk fluorocarbon material.

Chuang himself acknowledges that his seven-bit quantum computer falls far short of the scale needed for meaningful computations. “To make it practical, we’ll have to get to thousands, if not hundreds of thousands, of qubits,” he says. A rival approach that uses nanoscale engineering techniques to build qubit containers, he adds, may be easier to scale up.

David J. Wineland and his team are studying this alternative at the Time and Frequency Division of the U.S. National Institute of Standards and Technology in Boulder, CO. They’re building miniature devices using electrodes that isolate ions in “traps” fashioned from electric fields. The virtue of this approach, Wineland says, is that ion traps are relatively easy to fabricate, may be linked together, and can hold more than one ion per trap. Wineland suggests that a string of ions confined in a single trap might function as a sort of quantum memory, with each additional qubit doubling its storage capacity, so that capacity grows exponentially with the length of the string. Already, Wineland’s group has coaxed such qubits to stay in a state of superposition for up to 10 minutes. But one current weakness in this scheme is that it’s difficult to transfer quantum information between ions held in separate traps, a necessity for large-scale computations.

Quantum Spinoffs

Creating qubits that interact and stay in superposition long enough to make themselves useful will preoccupy quantum-computing researchers for years to come. Still, practical payoffs are emerging as scientists exploit the phenomena behind quantum computing in related fields.

At NIST, whose institutional mission includes setting standards for the measurement of time, Wineland’s interest in quantum computation predated even Shor’s algorithm. “We were starting to think up ways quantum entanglement could be used to improve the signal-to-noise ratio in atomic clocks,” he explains. “We knew there was a quantum entanglement state that could improve the clock, and the ideas of quantum computation showed how to make it.” Roughly speaking, today’s atomic clocks work by taking the average of simultaneous readings of the oscillating magnetic fields of more than a million cesium atoms; quantum entanglement could reduce the time needed to compute that average and enhance precision by allowing many readings to be taken concurrently.

The Jet Propulsion Laboratory’s Dowling adds that quantum entanglement may provide a better way to synchronize earthbound clocks with those in space. At present, ground-and-space synchronization, which is most often done by radio, is thrown off, albeit minutely, by atmospheric refraction and other effects. Because entangled photons are linked on a quantum level, they are immune to these physical disturbances. “It would be a really big deal to knock those [effects] out,” Dowling says. He suggests sending entangled particles to the sites to be synchronized. Measuring one particle would instantaneously set the other one to “ticking,” Dowling says. Having calibrated their clocks to the ticking particles, operators would know the clocks were in agreement.

Lest anyone think that quantum-scale refinements of time measurement are of academic interest only, it should be noted that atomic timekeeping is the basis of global positioning systems, satellite tracking technologies, and mobile communications networks, which are synchronized by the second. “History has shown forever that whenever there’s a better clock, it gets used,” Wineland says. “It’s a good bet that trend will continue.”

Scientists in industry, meanwhile, are looking for ways to bootstrap quantum computing by linking it with conventional technologies in which they have more experience. Last year Hewlett-Packard forged a $2.5 million working alliance with Gershenfeld and Chuang to, as HP Labs senior scientist Philip Kuekes says, “combine our respective expertise.” HP is intrigued, for instance, by the possibility of transmitting quantum bits via ordinary fiber-optic lines, thousands of kilometers of which lie installed but underutilized around the country. “That’s actually quite interesting,” Kuekes says. The long-distance transmission of quantum information, enhanced by the characteristics of quantum entanglement, would allow correspondents to share code keys without fear of their being compromised. That means, he adds, that “one of the things that might happen quite early is quantum cryptography.”

Although, as research has shown, qubits can be transmitted over fiber-optic lines, the transmissions work for no more than tens of kilometers at a time. Sending qubits across continents or oceans, Kuekes says, would require a system of quantum switches and repeaters analogous to the solid-state versions that help move data throughout the Internet. These would amount to simple quantum computers equipped with error-correcting software that could compensate for the inevitable loss of superposition among many of the traveling qubits. Development of this software is one of the main thrusts of HP Labs’ research.

In a scientific example of the child’s being father to the man, applied research has thrown off some ancillary benefits even in the parent science of quantum mechanics. The tools needed to perfect quantum computing, it turns out, also help demonstrate particle behavior that physicists, until now, have posited only in theory.

“There’s a beautiful flowing-back the other way,” says John Preskill, a professor of theoretical physics at Caltech. “Interest in quantum computing has inspired a lot of interesting science. We’re a long way from a crash program in engineering, but we’re entering a new era in condensed-matter physics.”

This is largely a result of quantum computing’s requirement that qubits be controlled and measured with unprecedented precision. “The tradition in condensed-matter physics has been to do experiments on ensembles,” that is, huge quantities of atoms whose quantum behavior can be identified statistically, Preskill says. “You don’t ordinarily measure the behavior of single electrons.”

In particular, Wineland’s experiments at NIST, Preskill says, have given physicists an unparalleled window on the behaviors of individual particles. The breakdown of qubits into classical ones or zeroes, for example, is a phenomenon that, in the past, scientists could only infer by observing whole clouds of electrons or photons. The clouds’ average signal would indicate whether some particles had changed their quantum state, “but you wouldn’t really see the individual particle behavior,” Preskill says. “It really is a new type of experiment.”

Preskill, like others in the field, cautions that many questions need to be answered and critical problems resolved before quantum computing can advance beyond its current elementary applications. “Whether this field will still look this exciting in 10 years, I can’t say,” he admits. “But for now, the field feels fresh and new.” Once again.
