The average home computer is a classical computer, and it works in a straightforward way: information is processed in discrete chunks, bits that are represented, in computer language, as either a 1 or a 0. Quantum computers, by contrast, process information using qubits, which can register a 1 and a 0 at the same time. This difference gives quantum computers, for certain problems, exponentially more power than classical computers.
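To make the distinction concrete, here is a toy numerical sketch (not anything from the researchers' work): a qubit can be described by two amplitudes whose squared magnitudes give the odds of measuring a 0 or a 1, and a register of n qubits is described by 2^n amplitudes at once.

```python
import numpy as np

# A classical bit is one of two discrete values.
classical_bit = 0

# A qubit is described by two amplitudes (alpha, beta): measuring it
# yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
# An equal superposition "registers" both outcomes until it is measured.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)
probabilities = np.abs(qubit) ** 2
print(probabilities)  # -> [0.5 0.5]

# n classical bits hold one of 2**n values at a time; n qubits are
# described by 2**n amplitudes simultaneously -- the source of the
# "exponential" state space mentioned above.
n = 3
register = np.ones(2**n) / np.sqrt(2**n)  # uniform superposition over 8 states
print(len(register))  # -> 8
```

The catch, as the rest of the article describes, is keeping those amplitudes intact long enough to compute with them.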
Small-scale quantum computers, those that deal with a handful of qubits, have existed for a number of years. In the 1990s, says Sankar Das Sarma, a professor of physics at the University of Maryland, researchers proposed that errors caused by decoherence could be reversed after the fact using software, but that approach remains theoretical. Meanwhile, researchers tried to reduce decoherence by shielding their systems as best they could from environmental fluctuations. Eventually, researchers proposed schemes in which decoherence errors could be suppressed in the hardware of a quantum computer itself; however, Das Sarma says, these were thought to be too tricky to implement experimentally.
Biercuk explains that he and his colleagues borrowed ideas for their research from the nuclear magnetic resonance community, which has existed for decades but whose techniques had never been applied specifically to quantum-computing systems. To implement the technique, the researchers first measure the characteristic environmental noise; knowing its profile, they can apply a series of magnetic pulses to their qubits at precise intervals to snap them back into a state of superposition. They have also recreated noise conditions common in other quantum-computing systems, such as those built in silicon, and adjusted the timing of the pulses accordingly, to show that the approach works in those settings as well.
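The core idea behind such pulse sequences can be illustrated with a toy simulation (this is a sketch of the general principle, not the Maryland group's actual protocol, and the noise strength and timings below are invented for illustration): a qubit drifting under slow environmental noise accumulates an unwanted phase, and an ideal "pi pulse" flips the qubit so that noise accumulated afterward cancels noise accumulated before, the classic spin-echo trick from NMR.

```python
import numpy as np

rng = np.random.default_rng(0)

def accumulated_phase(noise, pulse_times, dt=1e-3):
    """Phase a qubit picks up under a noisy detuning trace.

    Each ideal pi pulse flips the sign with which later noise adds to the
    total phase, so slow noise before and after the pulse cancels out.
    Sequences with many pulses (e.g., CPMG-style trains) extend this
    cancellation to faster noise, which is why the pulse timing is tuned
    to the measured noise spectrum.
    """
    sign = 1.0
    phase = 0.0
    pulses = sorted(pulse_times)
    for i, delta in enumerate(noise):
        t = i * dt
        while pulses and t >= pulses[0]:
            sign = -sign       # an ideal pi pulse inverts the qubit
            pulses.pop(0)
        phase += sign * delta * dt
    return phase

steps, dt = 1000, 1e-3
total_time = steps * dt

# Quasi-static noise: an unknown but constant detuning for this run.
noise = np.full(steps, rng.normal(0.0, 2 * np.pi * 5.0))

free = accumulated_phase(noise, [], dt)                 # no correction
echo = accumulated_phase(noise, [total_time / 2], dt)   # one pi pulse (Hahn echo)

print(abs(free), abs(echo))  # the echo phase is ~0 for static noise
```

With no pulses the qubit picks up the full noise-induced phase; a single pulse at the midpoint cancels it almost exactly for slow noise, which is the effect the researchers exploit, with the timing adapted to the noise they actually measure.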
“It’s a nice technique,” says Seth Lloyd, a professor of mechanical engineering at MIT. “They took some well-known techniques from nuclear magnetic resonance, juiced them up, and turbocharged them.” In the near term, says Lloyd, the technique could be used to improve the accuracy of atomic clocks. “In the long term,” he says, “you might be able to use this to make a better quantum computer.”