
Scaling Up a Quantum Computer

A series of sustained quantum operations shows promise for developing a practical device.
August 7, 2009

Researchers at the National Institute of Standards and Technology (NIST) in Boulder, CO, have demonstrated multiple computing operations on quantum bits, a crucial step toward building a practical quantum computer.

Shine on, ions: Beryllium ions are trapped inside the dark slit on the left side of this chip. When researchers focus lasers on the ions, the ions can be used to perform quantum calculations.

Quantum computers have the potential to perform calculations far faster than the classical computers used today. This superior computing power comes from the fact that these computers use quantum bits, or qubits, which can represent both a 1 and a 0 at the same time, in contrast to classical bits that can represent only a 1 or a 0. Scientists take a number of different approaches to creating qubits. At NIST, the researchers use beryllium ions stored within so-called ion traps. Lasers tuned to specific frequencies control the ions' electronic states, and the electronic states of the ions and their interactions determine the quantum operations that the machine performs.
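As a rough illustration of what "both a 1 and a 0" means (a toy model, not the NIST apparatus), a single qubit can be described by two complex amplitudes, one for each classical value; the function and variable names below are purely illustrative:

```python
import math

def probabilities(state):
    """Return the measurement probabilities (P(0), P(1)) for a qubit state.

    A qubit state is a pair of complex amplitudes (alpha, beta) for the
    values 0 and 1, normalized so |alpha|^2 + |beta|^2 = 1.
    """
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

def hadamard(state):
    """Apply a Hadamard gate, the standard one-qubit superposition-creating
    operation (in an ion trap, such gates are driven by laser pulses)."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

# A classical bit would be definitely 0:
zero = (1 + 0j, 0 + 0j)

# After the gate, the qubit holds both values at once: measuring it
# yields 0 or 1 with probability 1/2 each.
superposed = hadamard(zero)
p0, p1 = probabilities(superposed)
print(p0, p1)
```

The point of the sketch is that `superposed` is a single, definite quantum state, even though a measurement of it is random; the computational power of the NIST device comes from manipulating many such amplitudes at once.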

Over the past few decades, researchers have made steady progress toward a quantum computer, for instance, by storing quantum data or performing logic operations on qubits. But the NIST work, which is published online today by the journal Science, pieces together several crucial steps for the first time. The work involved putting an ion into a desired state, storing qubit data in it, performing logical operations on one or two of the qubits, transferring that information among different locations, and finally reading out each qubit individually. Importantly, the researchers show that they can perform one operation after another in a single experiment.

“This is the next step in trying to put a quantum computer together,” says Dave Wineland, lead researcher on the project. “It’s nice to have reached this stage.”

The NIST team performed five quantum logic operations and 10 transport operations (meaning they moved the qubit from one part of the system to another) in series, while reliably maintaining the states of their ions, a tricky task because the ions can easily be knocked out of their prepared state. In other words, the researchers had to be careful that they didn't lose quantum combinations of 1s and 0s while they manipulated their ions.

One of the major problems in performing multiple operations is that the ions heat up after a single operation, in which laser beams, tuned to specific frequencies, adjust the energy level of electrons. Once this happens, explains Jonathan Home, a postdoctoral researcher at NIST, the researchers can’t do any further operations because the qubits can no longer hold both a 1 and a 0. To solve this problem, the researchers added magnesium ions to the mix. These ions are cooled with another set of lasers and, though the cold magnesium ions are not used for computation, they effectively chill the beryllium ions, keeping them in a stable state.

A second challenge when repeating operations inside this type of quantum computer is making sure that the ions are protected from stray magnetic fields, which can also cause them to lose their quantum state. To solve this problem, the researchers chose specific energy levels in which the ions are temporarily impervious to changes in surrounding magnetic fields. This maintains the qubit's state for up to 15 seconds, plenty of time, says Home, to perform a series of millisecond-long operations. "Our particular choice of levels doesn't change with the magnetic field," he says. "We don't have to worry about the lifetime of the qubits anymore."

The experiment is a “milestone accomplishment,” says Isaac Chuang, a professor in the electrical engineering, computer science, and physics departments at MIT. “Very much like the early evolution of transistors into calculators, this work demonstrates a complete assembly of basic steps needed for a scalable quantum computer.” Chuang adds that the research “sets the bar” for other quantum computing systems.

In demonstrations, the researchers manipulated two qubits at a time. For ion trap systems, the maximum number of qubits used in various experiments so far is less than 10. In order to outperform a classical computer, the researchers would need to perform operations on 30 or more qubits, suspects Home, something he thinks could happen in the next five to 10 years. While quantum computers hold promise for breaking ultrasecure encryption codes, he says that early quantum computers will most likely be used to simulate physical systems, for example, the electronic properties of materials.

But to get there, the researchers will need to improve their system. Currently, it performs with 94 percent accuracy. For a quantum computer to be reliable enough to use, it must be 99.99 percent accurate. A major factor limiting the system's accuracy is intensity fluctuation in the lasers that perform the operations on the ions. However, new, more reliable, and more powerful ultraviolet lasers could solve this problem, says Home.
