
New Twists in the Road to Quantum Supremacy

Quantum computers will soon surpass conventional ones, but it will take time to make the machines useful.
October 25, 2017

After decades of hype and headlines, quantum computers are finally poised to demonstrate their superiority over conventional machines.

Precisely when this will happen is a bit fuzzy, though. What’s more, it will be a while yet before these magical machines have any noticeable impact on our lives.

The point at which a quantum machine should be able to perform computations too complex to model on any conventional machine, a landmark known as “quantum supremacy,” is believed to be about 49 qubits, the quantum equivalent of the bits that represent 1 or 0 in a conventional computer.
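To see why the threshold sits near 49 qubits, consider the memory cost of simulating a quantum machine exactly: the full state of n qubits requires 2^n complex amplitudes, so the storage needed doubles with every qubit added. A rough back-of-the-envelope sketch (an illustration of the arithmetic, not a calculation from IBM or Google):

```python
# Back-of-the-envelope: memory needed to hold the full state
# vector of an n-qubit machine on a classical computer.
# Each of the 2**n complex amplitudes takes 16 bytes
# (two 64-bit floats).

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required for a dense n-qubit state vector."""
    return (2 ** n_qubits) * 16

for n in (30, 40, 49, 50):
    gib = state_vector_bytes(n) / 1024 ** 3
    print(f"{n} qubits -> {gib:,.0f} GiB")

# 30 qubits fit in 16 GiB of RAM; 49 qubits need about
# 8.4 million GiB (8 PiB), beyond any single machine.
```

At 30 qubits the state fits in an ordinary workstation’s memory; at 49 it needs roughly eight pebibytes, which is why simulating past that point takes the kind of mathematical shortcuts IBM describes.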

Google’s researchers appear to be leading in the race for a 49-qubit machine (see “Google’s New Chip Is a Stepping Stone to Quantum Computing Supremacy”). Earlier this week, however, researchers at IBM’s quantum research lab in Yorktown Heights, New York, demonstrated that it is possible to model the behavior of a quantum computer beyond the 49-qubit landmark by harnessing several clever mathematical techniques. IBM is also allowing programmers to experiment with its quantum computers through a cloud platform called IBM Q.

Two IBM quantum computing scientists, Hanhee Paik (left) and Sarah Sheldon, examine one of the company’s machines.
IBM

“We don’t think there will be a single landmark or metric to gauge the capability of a quantum computer,” says Bob Wisnieff, a researcher at IBM who’s involved with the new simulation work. “We are actively looking at methods that show quantum machines have an advantage over classical systems.”

A quantum computer will need significantly more than 49 qubits to be useful. The real measure of progress will be whether these machines can tackle practical problems, and it remains unclear when that will become possible, though momentum is building.

To surpass what conventional computers can achieve by processing information in the form of conventional bits, quantum computers exploit the counterintuitive, probabilistic nature of physics at the atomic and subatomic scale. By harnessing superposition and entanglement—concepts that baffled and annoyed Einstein—these machines can compute in a fundamentally different way, carrying out immensely complex calculations at speeds that would otherwise be inconceivable (see “10 Breakthrough Technologies 2017: Practical Quantum Computers”).
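The two concepts can be made concrete with a few lines of linear algebra. The sketch below (an illustration added here, not code from IBM or Google) simulates two qubits as a vector of four complex amplitudes: a Hadamard gate puts the first qubit into an equal superposition, and a CNOT gate entangles it with the second, producing the classic Bell state.

```python
import numpy as np

# Minimal two-qubit statevector simulation illustrating
# superposition (Hadamard) and entanglement (CNOT).

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])        # qubit 0 controls qubit 1

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>

state = np.kron(H, I) @ state   # qubit 0 into equal superposition
state = CNOT @ state            # entangle: (|00> + |11>) / sqrt(2)

# Measurement probabilities -> 0.5, 0, 0, 0.5:
# half the time |00>, half the time |11>, never |01> or |10>.
print(np.round(np.abs(state) ** 2, 3))
```

Measuring one qubit of the final state instantly fixes the other, no matter the distance between them, which is precisely the correlation Einstein found so objectionable.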

Despite the back-and-forth over the measure of a quantum computer’s capacity, the consensus among experts is that reaching 49 qubits would still be a significant step. “Any system with lots of qubits is worthwhile, because to get to 1,000 or 1,000,000 qubits we need to deal with 100 first,” says Christopher Monroe, a professor at the University of Maryland who studies quantum information theory. Simulating quantum components plays an important role in driving progress toward more complex systems, as it isn’t always practical to test the behavior of a design using real hardware.

Both Google and IBM are developing their machines using superconducting circuits cooled to extreme temperatures. IBM has announced a 16-qubit machine, and Google is widely believed to have a 22-qubit machine, although the company has yet to officially confirm this.

IBM’s 16-qubit chip.

As they race to build the first practical quantum computers, those involved are also hustling to develop the software tools that will make these machines useful. In addition to IBM’s cloud platform, this week Google and a California-based startup called Rigetti Computing announced software for converting chemical simulations into a form that a quantum computer can handle. This new software, called OpenFermion, is freely available and designed to work with other quantum computers, including IBM’s.
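To give a flavor of what such software does, here is a minimal sketch using OpenFermion’s public Python interface (an illustration of the library’s basic workflow, not code from Google or Rigetti): a fermionic operator of the kind that appears in a chemistry Hamiltonian is rewritten as Pauli operators that a qubit machine can execute.

```python
# A minimal sketch of the translation OpenFermion performs:
# an electron-hopping term from a chemistry Hamiltonian is
# mapped onto Pauli operators a quantum computer can run.
# (Illustrative use of the public API, not code from the article.)

from openfermion import FermionOperator, jordan_wigner

# Hopping between spin-orbitals 0 and 1: a_0^dagger a_1 + h.c.
hopping = FermionOperator("0^ 1", 1.0) + FermionOperator("1^ 0", 1.0)

# Jordan-Wigner transform: fermionic modes -> qubit operators.
qubit_hamiltonian = jordan_wigner(hopping)
print(qubit_hamiltonian)
# Expect Pauli terms like 0.5 [X0 X1] + 0.5 [Y0 Y1]
```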

Chemistry and materials science are the first targets for quantum computing because the technology could offer a way to model the interactions of atoms at completely new levels of complexity (see “Chemists Are First in Line to Benefit from Quantum Computing”).

Monroe says efforts like IBM Q and OpenFermion will prove crucial in opening up potential applications of the technology as it scales up.

“I believe in the next five to 10 years we will have 100-plus-qubit machines that will be available to anyone, and this will be when useful applications will be found,” Monroe says. “My guess is that useful quantum applications will only be found once we build quantum machines that can be used by people who know about difficult problems in logistics, economic markets, pattern recognition, and modeling of materials.”

Interest is growing in whether quantum computers could also be useful for machine learning, although Andrew Childs, another professor at the University of Maryland, says this remains an open challenge. “There’s indeed a lot of buzz about quantum machine learning,” he says. “I think this area is very interesting, but its promise is far from clear.”

Scott Aaronson, a professor at the University of Texas at Austin and the head of its Quantum Information Center, said in a recent blog post that IBM’s simulation result did not diminish the importance of Google’s quantum supremacy goal.

Speaking via e-mail to MIT Technology Review, Aaronson also warned that the milestone will no doubt attract considerable hype. “Of course there’s a risk that quantum supremacy stuff will be overhyped and misunderstood,” he wrote. “In this field, what hasn’t been?”
