Quantum computers capable of mind-boggling computations are finally on the horizon. But what will the first useful machines look like?
Industry heavy hitters including IBM, Google, Microsoft, and Intel, as well as a few startups like Rigetti Computing and Quantum Circuits Incorporated, are all making steady advances toward more capable quantum computers by using superconducting circuits cooled to extreme temperatures.
Meanwhile, two research teams have demonstrated that an approach largely ignored by industry—using trapped atoms to perform calculations—can be scaled up to a new level of complexity and used to perform valuable work. The resulting systems are not universal quantum computers capable of performing any calculation, but they suggest that an atomic approach may have more potential than presumed. The work also hints that atoms could ultimately offer a better way to turn laboratory systems into large-scale practical quantum computers.
The superconducting approach has proved successful partly because the engineering techniques used to fabricate silicon circuitry have been honed over the past several decades (see “10 Breakthrough Technologies 2017: Practical Quantum Computers”). But it is possible to build a quantum computer using a wide range of approaches.
In two papers published today in the journal Nature, a team at MIT and Harvard in Cambridge, Massachusetts, and another from the University of Maryland and the National Institute of Standards and Technology (NIST), reveal that they have built specialized quantum calculators, each of which uses more than 50 qubits—well beyond what had been demonstrated previously. In both cases, the researchers created quantum simulators, machines capable of using analog calculations to model how quantum particles interact.
The two systems both use atoms but work in different ways. The MIT-Harvard system handles 51 qubits by using lasers to trap neutral atoms in an excited state. The Maryland-NIST machine, which handles 53 qubits, traps ytterbium ions in place using gold-coated electrodes. Together, they suggest that an alternative approach to building quantum machines might yet have the potential to challenge the one being pursued by industry.
“While our system does not yet constitute a universal quantum computer, we can effectively program it by controlling the interactions between the qubits,” says Mikhail Lukin, a physicist at Harvard who developed one of the systems in collaboration with Vladan Vuletic at MIT.
Will Zeng, a researcher at Rigetti Computing, a company that has received tens of millions in venture funding to pursue quantum computing, says quantum simulation at this scale is a significant step. In fact, simulating quantum effects was the original purpose for a quantum computer proposed by physicist Richard Feynman more than three decades ago. Now scientists “are able to show some of the potential inherent in quantum computers, so the results are exciting,” he says.
Quantum computers work in a fundamentally different way from conventional computers. While a normal computer takes binary bits of information, encoded as either 1 or 0, and performs calculations on them one after another, a quantum computer exploits two counterintuitive features of quantum mechanics—entanglement and superposition—to perform calculations in parallel. As a result, it can calculate with large amounts of information in far less time. Just a few dozen quantum bits can, in principle, perform computations on quadrillions of pieces of information in one step.
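To make the scaling concrete: describing an n-qubit register classically requires tracking 2^n complex amplitudes, one per possible basis state. A minimal sketch (not from the article; the function name is illustrative) shows why 50 qubits is roughly the point where classical simulation becomes intractable:

```python
def state_vector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to fully describe
    the state of an n-qubit quantum register (2 ** n)."""
    return 2 ** n_qubits

# A single qubit needs 2 amplitudes; 50 qubits need over a quadrillion.
for n in (1, 10, 50):
    print(f"{n:2d} qubits -> {state_vector_size(n):,} amplitudes")
```

At 50 qubits the state vector holds 2^50 (about 1.1 quadrillion) amplitudes, which, at 16 bytes per complex number, would take roughly 18 petabytes of memory to store exactly—hence the "quantum supremacy" threshold discussed below.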
The technology remained a pipe dream among physicists for years, but it undoubtedly has enormous potential. Excitement is now growing about finally building machines capable of doing useful work.
The 50-qubit benchmark is significant because around that point, quantum machines become capable of performing calculations that would be difficult, if not impossible, to run on even the most enormous supercomputer available. Some scientists refer to this as “quantum supremacy” (see “Google Reveals a Blueprint for Quantum Supremacy” and “IBM Raises the Bar with a 50-Qubit Quantum Computer”). Both IBM and Google are developing general-purpose superconducting quantum computers capable of using around the same number of qubits.
Perhaps more significant, the qubits in the new atomic systems may be better suited to scaling up, says Chris Monroe, a professor at the University of Maryland and the lead author on one of the papers. The qubits in solid-state systems are not identical, meaning a system needs to be carefully calibrated, and this can be tricky as the size of a machine grows. In contrast, qubits made using atoms, while more difficult to control, are identical and need no tuning. “Atoms are, in a sense, the perfect qubit,” Monroe says. He adds that atomic systems may prove easier to reconfigure, making them more suitable to tackling a wider range of problems.
That isn’t to say building larger, more practical quantum systems will be easy for anyone. “We think we can go to around a thousand quantum bits in a straightforward way, but the situation is less clear beyond that,” says Vuletic.
Just as important, we are only getting hints of how useful quantum computers will really be. In a landmark study published this September, a team at IBM used a quantum computer, called IBM Q, to simulate the structure of beryllium hydride, the most complex molecule ever analyzed in this way.
We probably won’t know what these machines are capable of until many more engineers and programmers get their hands on them. “We’re starting to move beyond the era of physics to quantum engineering,” says UMD’s Monroe.