The quantum computer is the poster-technology for the next generation of computing, mainly because the computational power of such a machine could far exceed that of conventional computers. For instance, a quantum computer could quickly determine all the factors of a 300-digit number – a feat that’s impossible even for all of the world’s supercomputers combined.
Such a capability isn’t just a number-crunching trick either: factoring is fundamental to cryptography – and governments handsomely support research that could keep their information safe.
Despite grant money and the work of corporate, university, and government research groups – at IBM, Hewlett Packard, and the National Institute of Standards and Technology (NIST), to name a few – quantum computing has remained primarily in the lab (see Harnessing Quantum Bits, March 2003). Recently, however, researchers at the University of Michigan have pushed quantum computing closer to the real world with the fabrication of a key component, an “ion trap,” on a common semiconductor chip. This ion trap is so important because it can host a quantum bit, or “qubit,” the most fundamental element in quantum computing. The advance suggests one way that the hardware for quantum computing could be mass-produced.
“We, along with a small handful of other research groups across the world, are working hard to see how far ion trap technology can be pushed,” says Christopher Monroe, professor of physics at the University of Michigan in Ann Arbor and leader of the ion trap study, which was described in the December 11 issue of Nature.
Ion-trap technology uses electric and magnetic fields to isolate a charged particle from its environment – a prerequisite for exploiting the temperamental quantum properties of electrons. Although ion traps are just one technology for building a quantum computer, they have the longest history – Monroe helped demonstrate the first ion-trap quantum logic gate in 1995 – and they’ve advanced the furthest.
All quantum computation exploits the quantum nature of a property such as an electron’s spin or a photon’s polarization. Quantum theory dictates that until these properties are actually observed, they are indeterminate: the spin of an electron, for instance, can be “up” or “down,” or a combination of the two.
Therefore, in a quantum computer, an electron’s spin can represent a 1 and a 0 simultaneously, forming a qubit. Only when the electron’s spin is observed, nominally after a computation is complete, does a qubit correspond to a definitive value of 1 or 0. Because qubits compute with both values at once, the processing power of a quantum computer doubles with each additional qubit: two qubits can do the work of four conventional bits; three qubits, the work of eight, and so on.
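That doubling can be made concrete with a toy classical simulation: the state of n qubits is described by a vector of 2^n amplitudes, so each added qubit doubles the amount of information being processed at once. A minimal sketch (the function name is illustrative, not from the research described here):

```python
import numpy as np

def uniform_superposition(n_qubits):
    """Return the state vector of n qubits, each in an equal
    superposition of 0 and 1 -- one amplitude per classical
    bit string, so the vector has 2**n_qubits entries."""
    dim = 2 ** n_qubits
    # Equal amplitudes, normalized so the probabilities sum to 1.
    return np.full(dim, 1 / np.sqrt(dim))

# The vector length doubles with each added qubit:
for n in (1, 2, 3):
    print(n, "qubits ->", len(uniform_superposition(n)), "amplitudes")
```

The printout shows 2, 4, and 8 amplitudes for one, two, and three qubits – the same doubling the article describes. (The catch, of course, is that a classical simulation must store all 2^n numbers explicitly, which is exactly why large quantum computers cannot be efficiently simulated.)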
But most ion traps are difficult to fabricate, consisting of a ceramic insulator and gold contacts for conducting an electrical current. In contrast, Monroe’s team built their chip out of insulating layers of aluminum gallium arsenide and semiconducting layers of gallium arsenide – all easy to deposit on a chip using a conventional process called molecular beam epitaxy. They then etched a hole in the chip and “fashioned a set of cantilevered electrodes protruding over the hole,” says Monroe.
The chip is placed in a vacuum chamber, into which a vapor of cadmium ions is injected. When the appropriate voltages are applied to the electrodes, a cadmium ion with a free electron becomes trapped, floating between the cantilevers above the etched hole. In order to actually use the atom’s free electron for computation, Monroe explains, the ion must be probed by a laser beam that reads the electron’s spin state. The challenge “was just a matter of getting the right combination of parameters,” such as electrode voltages and laser wavelengths (“and the phases of the moon,” Monroe jokes).
This first ion-trap chip builds on a quantum computing roadmap that Monroe, David Kielpinski at MIT, and David J. Wineland at NIST published in Nature in 2002. Their plan envisioned a large-scale quantum computer using multiple ion traps on a chip.
“The trick is to take [a single ion trap] that is reasonably well understood and make more of them,” says physicist Bruce Kane at the University of Maryland, who’s working on a silicon-based quantum computer.
Putting multiple traps on one chip presents difficulties, Kane says, because as the number of traps increases, it becomes harder for a laser to read the state of an individual electron without interfering with the states of the others.
Despite such challenges, though, Carl Williams, coordinator of the NIST Quantum Information Program, says the Michigan research is “another step forward” in quantum computing.
“They’ve used state-of-the-art fabrication techniques to design much more complicated traps so that they can actually build a quantum computer,” he says. “And that’s an important result.”
For quantum computers to be truly useful in cryptography, engineers will need to build machines with roughly 10,000 qubits, the number required to factor a 100-digit number, explains Kane.
Long before that milestone, however, other useful applications for quantum chips may appear. Says Kane: “I think most workers in the field would say, ‘Yeah, these goals are necessary – but we don’t understand fundamentally what could happen’ in the meantime.”
MIT Technology Review