Whatever the technological approach, the goal of quantum computing is to take advantage of the very quality of quantum mechanics that designers of classical computers try mightily to avoid: its pure weirdness.
The logic circuits of traditional computers, for example, work by detecting discrete differences in the voltage at electronic gates: a high voltage indicates a binary one and a low voltage indicates a zero. Binary information can also be represented by the spin of an electron or the polarization of a photon. But because these particles exist in the quantum realm, which is marked by infinitesimally short distances, nearly speed-of-light velocities, and extremely low energy levels, their states can’t always be discerned as strictly one or zero. Electrons and photons can behave like waves rather than particles, appear to occupy more than one place at a time, and simultaneously exhibit incompatible states: an electron’s spin, for instance, can be both “up” and “down,” a condition that quantum mechanics calls superposition. In computing terms, a quantum bit, or qubit, can be one and zero simultaneously.
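One loose way to picture superposition (a sketch of my own, not the article’s; the function name is illustrative) is as a pair of amplitudes attached to the qubit, with measurement picking one outcome at random according to the squared amplitudes:

```python
import random

# A qubit's state can be described by two amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. An equal superposition has
# alpha = beta = 1/sqrt(2): "up" and "down" at once.
def measure(alpha, beta):
    """Collapse the superposition: return 0 or 1 at random,
    weighted by the squared amplitudes."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

half = 2 ** -0.5                      # 1/sqrt(2)
outcomes = [measure(half, half) for _ in range(10000)]
# Each measurement yields a definite 0 or 1; over many trials,
# roughly half come out each way.
print(outcomes.count(0), outcomes.count(1))
```

Until it is measured, the qubit carries both possibilities; the definite one-or-zero answer appears only at readout.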
The second phenomenon important to quantum computing is “entanglement,” in which two or more particles are created with mutually dependent properties: when, for example, a single photon is converted into two complementary photons by a nonlinear optical crystal. Measuring the properties of one particle instantaneously determines the state of the other, even if at the moment of measurement the two are separated from one another by galactic distances. (Einstein famously derided this phenomenon as “spooky action at a distance.”)
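The correlation itself can be caricatured in a few lines (my own toy model, not from the article; it mimics only this particular agreement between the particles, not the full statistics of quantum mechanics):

```python
import random

def measure_pair():
    """Toy model of measuring an entangled pair in the same basis:
    each individual outcome is random, but the two always come out
    complementary, however far apart the particles are.
    (A classical shared coin flip can mimic this one correlation;
    reproducing every quantum prediction is another matter.)"""
    a = random.randint(0, 1)   # measuring one particle...
    b = 1 - a                  # ...instantly fixes the other's state
    return a, b

pairs = [measure_pair() for _ in range(1000)]
print(all(a != b for a, b in pairs))   # prints True
```

Each particle’s result looks like a fair coin toss on its own; only when the two lists are compared does the perfect lockstep appear.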
It has long been believed that these phenomena could be employed to solve computational problems beyond the reach of traditional technology. A single quantum bit that can be in two states at once can do the job of two classical bits operating in parallel. (Only when the particle is measured or observed do all the possibilities resolve into a single, classical reality.) Two entangled qubits, meanwhile, can simultaneously evaluate four inputs. Put another way, a traditional memory register with eight bits can store only one of a possible 2⁸, or 256, digital “words,” but a quantum register with eight qubits can represent and compute with all 256 words at once.
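The register arithmetic can be checked directly. In this sketch (my own illustration) an n-qubit register is represented the way simulators typically do it, as a vector of 2**n amplitudes, one per classical word:

```python
# A classical 8-bit register holds exactly one of 2**8 = 256 words.
classical_register = 0b10110001          # a single value at a time

# A quantum 8-bit register is described by 2**8 amplitudes, one per
# basis word; in an equal superposition all 256 are nonzero at once.
n = 8
amplitudes = [(2 ** n) ** -0.5] * (2 ** n)

print(len(amplitudes))                   # 256 words represented at once
print(sum(a ** 2 for a in amplitudes))   # probabilities sum to 1.0
```

This also shows why simulating quantum computers classically gets hard fast: the amplitude vector doubles in length with every qubit added.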
Such general notions were articulated by Richard Feynman, among others, as long ago as the 1980s. For years, however, no one had a clear idea how they might be applied to real problems. In 1994 mathematician Peter W. Shor of AT&T Bell Labs described an algorithm, or program, that proved a quantum computer could factor large numbers exponentially faster than any known conventional method.
Shor’s discovery was important because factoring is exactly the kind of problem that overwhelms conventional computers: As a number grows larger, the resources required to factor it expand rapidly. To factor the number six is trivial, but experts estimate it would take all the supercomputers in the world longer than the age of the known universe to find the factors of a number with 300 digits.
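The blow-up is easy to feel even with the simplest classical method. This sketch (mine, not the article’s) counts the trial divisions needed, which grow roughly with the square root of the number, and therefore multiply relentlessly as digits are added:

```python
def trial_division(n):
    """Factor n by trial division, returning (factors, divisions tried)."""
    factors, ops, d = [], 0, 2
    while d * d <= n:
        ops += 1
        if n % d == 0:
            factors.append(d)
            n //= d
        else:
            d += 1
    if n > 1:
        factors.append(n)
    return factors, ops

# Factoring 6 is instant; a product of two 3-digit primes takes more
# work; two 5-digit primes, far more still. Real cryptographic moduli
# have hundreds of digits, putting this approach hopelessly out of reach.
for n in (6, 101 * 103, 10007 * 10009):
    factors, ops = trial_division(n)
    print(n, factors, ops)
```

Smarter classical algorithms do better than trial division, but the cost still balloons with the size of the number, which is exactly the behavior Shor’s algorithm sidesteps.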
Factoring, moreover, is a mathematical problem with real-world applications. The very intractability of large-number factoring is at the core of modern cryptography, which relies on it to create unbreakable keys. Shor’s algorithm represented both a dagger in the heart of old-style “unbreakable” codes and a signpost pointing toward a new class of truly unbreakable codes. Indeed, many quantum-computing experts predict that quantum cryptography will be the first commercial application to emerge from the fledgling science, and already at least three companies have been formed to market secure communications systems based on quantum cryptography (see “Quantum Cryptography,” TR February 2003).
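To see why factoring sits at the heart of such codes, here is a deliberately tiny RSA-style key (a toy of my own with toy numbers; real keys use primes hundreds of digits long), where recovering the secret is precisely a factoring problem:

```python
# Toy RSA-style key. Security rests entirely on the difficulty of
# factoring n back into p and q.
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus: 3233
e = 17                        # public exponent
phi = (p - 1) * (q - 1)       # requires knowing p and q
d = pow(e, -1, phi)           # private exponent (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)       # anyone can encrypt with (n, e)
plain = pow(cipher, d, n)     # only the keyholder can decrypt
print(plain)                  # prints 42

# An eavesdropper who could factor n = 3233 into 61 * 53 could
# recompute phi and d and read everything -- which is why hard
# factoring keeps the cipher safe, and why Shor's algorithm
# threatens it.
```

A quantum computer running Shor’s algorithm would recover p and q efficiently even at cryptographic sizes, collapsing the security assumption the scheme rests on.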