
Quantum Hardware

An innovative “ion trap” on a semiconductor chip could lay the foundation for mass production of quantum chips.
December 19, 2005

The quantum computer is the poster-technology for the next generation of computing, mainly because the computational power of such a machine could far exceed that of conventional computers. For instance, a quantum computer could quickly determine all the factors of a 300-digit number – a feat that is, for all practical purposes, beyond even all of the world’s supercomputers combined.

Such a capability isn’t just a number-crunching trick either: factoring is fundamental to cryptography – and governments handsomely support research that could keep their information safe.
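To see why factoring is so hard classically, consider the brute-force approach sketched below (illustrative only – real attacks use much faster sieve methods, but every known classical algorithm still slows down superpolynomially as the number of digits grows):

```python
def trial_division(n):
    """Factor n by testing divisors up to sqrt(n).

    For a d-digit number, there are on the order of 10**(d/2)
    candidate divisors to check, so each additional digit
    multiplies the work. This exponential wall is what Shor's
    quantum factoring algorithm would sidestep.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(8051))  # [83, 97]
```

A 4-digit number like 8051 falls instantly; a 300-digit RSA-style modulus would take this routine longer than the age of the universe.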

Despite grant money and the work of corporate, university, and government research groups – at IBM, Hewlett-Packard, and the National Institute of Standards and Technology (NIST), to name a few – quantum computing has remained primarily in the lab (see Harnessing Quantum Bits, March 2003). Recently, however, researchers at the University of Michigan have pushed quantum computing closer to the real world with the fabrication of a key component, an “ion trap,” on a common semiconductor chip. The ion trap is so important because it can host a quantum bit, or “qubit,” the most fundamental element in quantum computing. The advance suggests one way that the hardware for quantum computing could be mass-produced.

“We, along with a small handful of other research groups across the world, are working hard to see how far ion trap technology can be pushed,” says Christopher Monroe, professor of physics at the University of Michigan in Ann Arbor, and leader of the ion trap study, which was described in the December 11 issue of Nature.

Ion-trap technology uses electric and magnetic fields to isolate a charged particle from its environment – a prerequisite for exploiting the temperamental quantum properties of electrons. Although ion traps are just one technology for building a quantum computer, they have the longest history – ion traps themselves date back decades, and the first ion-trap quantum logic gate was demonstrated by Monroe and colleagues at NIST in 1995 – and they’ve advanced the furthest.

All quantum computation exploits the quantum nature of a property such as an electron’s spin or a photon’s polarization. Quantum theory dictates that until these properties are actually observed, they are indeterminate: the spin of an electron, for instance, can be “up” or “down,” or a combination of the two.

Therefore, in a quantum computer, an electron’s spin can represent a 1 and a 0 simultaneously, forming a qubit. Only when the electron’s spin is observed, typically after a computation is complete, does a qubit settle on a definitive value of 1 or 0. Because qubits compute with both values at once, the processing power of a quantum computer doubles with each additional qubit: two qubits can represent four states at once; three qubits, eight; and so on.
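That doubling is easiest to see in the bookkeeping a classical simulator has to do for a quantum register – one complex amplitude per basis state. A minimal sketch (the function name is illustrative):

```python
import math

def n_qubit_state(n):
    """Equal superposition over all 2**n basis states of n qubits.

    A classical simulator must track one amplitude per basis
    state, so the state vector doubles with every added qubit --
    the scaling described in the article.
    """
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)  # amplitudes normalized so probabilities sum to 1
    return [amp] * dim

for n in (1, 2, 3, 10):
    print(n, "qubits ->", len(n_qubit_state(n)), "amplitudes")
```

Ten qubits already require 1,024 amplitudes; the roughly 10,000 qubits mentioned later in this article would correspond to a state space no classical machine could ever store explicitly.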

But most ion traps are difficult to fabricate, consisting of a ceramic insulator and gold contacts for conducting an electrical current. In contrast, Monroe’s team built their chip out of insulating layers of aluminum gallium arsenide and semiconducting layers of gallium arsenide – all easy to deposit on a chip using a conventional process called molecular beam epitaxy. They then etched a hole in the chip and “fashioned a set of cantilevered electrodes protruding over the hole,” says Monroe.

The chip sits in a vacuum chamber, into which a vapor of cadmium ions is injected. When the appropriate voltages are applied to the electrodes, a cadmium ion with a free electron becomes trapped, floating between the cantilevers above the etched hole. To actually use the atom’s free electron for computation, Monroe explains, the ion must be probed by a laser beam that reads the electron’s spin state. The challenge “was just a matter of getting the right combination of parameters,” such as electrode voltages and laser wavelengths (“and the phases of the moon,” Monroe jokes).
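The voltages matter because a radio-frequency trap confines the ion only within a narrow window of drive parameters. In the standard pseudopotential picture of a Paul trap, the oscillating field acts like a harmonic well whose “secular” frequency follows from the dimensionless Mathieu parameter q. The sketch below uses textbook formulas with illustrative drive values – not the actual parameters of the Michigan chip trap:

```python
import math

E = 1.602e-19    # elementary charge (C)
AMU = 1.661e-27  # atomic mass unit (kg)

def mathieu_q(charge, mass, v_rf, omega_rf, r0):
    """Mathieu stability parameter q = 2QV / (m * Omega^2 * r0^2).

    The trap is stable only for q below roughly 0.9; larger drive
    voltage or tighter electrode spacing pushes q up.
    """
    return 2 * charge * v_rf / (mass * omega_rf**2 * r0**2)

def secular_freq(q, omega_rf):
    """Secular (trap) angular frequency in the q << 1 limit."""
    return q * omega_rf / (2 * math.sqrt(2))

m_cd = 111 * AMU                 # cadmium-111 ion
omega = 2 * math.pi * 16e6       # assumed 16 MHz RF drive
q = mathieu_q(E, m_cd, v_rf=4.0, omega_rf=omega, r0=50e-6)
w = secular_freq(q, omega)
print(f"q = {q:.2f}, secular frequency = {w / (2 * math.pi) / 1e6:.2f} MHz")
```

With these assumed values the ion oscillates in its effective well at a bit over a megahertz – the kind of “right combination of parameters” Monroe alludes to.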

This first ion-trap chip builds on a quantum computing roadmap that Monroe, David Kielpinski at MIT, and David J. Wineland at NIST published in Nature in 2002. Their plan envisioned a large-scale quantum computer using multiple ion traps on a chip.

“The trick is to take [a single ion trap] that is reasonably well understood and make more of them,” says physicist Bruce Kane at the University of Maryland, who’s working on a silicon-based quantum computer.

Putting multiple traps on one chip presents difficulties, Kane says, because as the number of traps increases, it becomes harder for a laser to read the state of an individual electron without disturbing the states of its neighbors.

Despite such challenges, though, Carl Williams, coordinator of the NIST Quantum Information Program, says the Michigan research is “another step forward” in quantum computing.

“They’ve used state-of-the-art fabrication techniques to design much more complicated traps so that they can actually build a quantum computer,” he says. “And that’s an important result.”

For quantum computers to be truly useful in cryptography, engineers will need to build machines with roughly 10,000 qubits, the number required to factor a 100-digit number, explains Kane.

Other useful applications for quantum chips may well appear before then, however, as the number of traps grows. Says Kane: “I think most workers in the field would say, ‘Yeah, these goals are necessary – but we don’t understand fundamentally what could happen’ in the meantime.”
