
Quantum supremacy from Google? Not so fast, says IBM.

The rival maker of quantum computers is disputing the much-vaunted claim that Google has hit a new milestone.
Google's quantum computer. Google; edited by MIT Technology Review

A month ago, news broke that Google had reportedly achieved “quantum supremacy”: it had gotten a quantum computer to run a calculation that would take a classical computer an unfeasibly long time. While the calculation itself—essentially, a very specific technique for outputting random numbers—is about as useful as the Wright brothers’ 12-second first flight, it would be a milestone of similar significance, marking the dawn of an entirely new era of computing.

But in a blog post published today, IBM disputes Google’s claim. The task that Google says might take the world’s fastest classical supercomputer 10,000 years can actually, says IBM, be done in just days.

As John Preskill, the Caltech physicist who coined the term “quantum supremacy,” wrote in an article for Quanta Magazine, Google specifically chose a very narrow task that a quantum computer is good at and a classical computer is bad at. “This quantum computation has very little structure, which makes it harder for the classical computer to keep up, but also means that the answer is not very informative,” he wrote.

Google’s research paper hasn’t been published (Update: it came out two days after this story), but a draft was leaked online last month. In it, researchers say they got a machine with 53 quantum bits, or qubits, to do the calculation in 200 seconds. They also estimated that it would take the world’s most powerful supercomputer, the Summit machine at Oak Ridge National Laboratory, 10,000 years to repeat it with equal “fidelity,” or the same level of uncertainty as the inherently uncertain quantum system.

The problem is that such simulations aren’t just a matter of porting the code from a quantum computer to a classical one. They grow exponentially harder the more qubits you’re trying to simulate. For that reason, there are a lot of different techniques for optimizing the code to arrive at a good enough equivalent.
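To get a feel for that scaling, here is a rough back-of-the-envelope sketch (ours, not from either company's paper): a brute-force simulation keeps one complex amplitude for every possible configuration of the qubits, so the memory needed doubles with each qubit you add.

```python
# Back-of-the-envelope sketch (not from either paper): memory needed to
# hold the full state vector of an n-qubit machine, assuming one 16-byte
# complex number per amplitude. Real simulators use many tricks to avoid
# this worst case; the point is only the doubling with every extra qubit.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude  # 2**n amplitudes, one per basis state

for n in (30, 40, 50, 53):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**30:,.0f} GiB")

# 30 qubits:          16 GiB   -- fits on a laptop
# 40 qubits:      16,384 GiB   -- a large cluster's worth of RAM
# 50 qubits:  16,777,216 GiB   -- roughly 16 pebibytes
# 53 qubits: 134,217,728 GiB   -- roughly 128 pebibytes
```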

And that’s where Google and IBM differ. The IBM researchers propose a method that they say would take just two and a half days on a classical machine “with far greater fidelity,” and add that “with additional refinements” the estimate could come down even further.

The key difference? Hard drives. Simulating a quantum computer on a classical one requires storing vast amounts of data in memory during the process to represent the condition of the quantum computer at any given moment. The less memory you have available, the more you have to slice up the task into stages, and the longer it takes. Google’s method, IBM says, relied heavily on storing that data in RAM, while IBM’s “uses both RAM and hard drive space.” IBM also proposes a slew of other classical optimization techniques, in both hardware and software, to speed up the computation. To be fair, IBM hasn’t tested the approach in practice, so it’s hard to know whether it would work as proposed. (Google declined to comment.)
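As a toy illustration of that trade-off (our sketch of the general idea, not IBM's actual scheme), a simulator whose working data won't fit in RAM can keep the bulk of it on disk and stream it through memory one slice at a time, at the cost of extra passes over the storage:

```python
import numpy as np

# Toy illustration of trading RAM for disk (not IBM's actual method):
# keep a large array of amplitudes in a memory-mapped file and process
# it in RAM-sized slices, writing each slice back to disk when done.

N = 2**27        # 2**27 complex128 values, about 2 GiB kept on disk
CHUNK = 2**22    # about 64 MiB handled in RAM at a time

state = np.memmap("state.bin", dtype=np.complex128, mode="w+", shape=(N,))

for start in range(0, N, CHUNK):
    block = np.array(state[start:start + CHUNK])  # copy one slice into RAM
    block *= np.exp(1j * 0.01)                    # stand-in for real work on the slice
    state[start:start + CHUNK] = block            # write the result back to disk

state.flush()
```

The fewer slices you have to make, the fewer round trips to disk, which is why having both plenty of RAM and plenty of drive space matters.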

So what’s at stake? Either a whole lot or not much, depending on how you look at it. As Preskill points out, the problem Google reportedly solved is of almost no practical consequence, and even as quantum computers get bigger, it will be a long time before they can solve any but the narrowest classes of problems. Machines capable of cracking modern encryption will likely take decades to develop, at a minimum.

Moreover, even if IBM is right that Google hasn’t achieved it this time, the quantum supremacy threshold is surely not far off. The fact that simulations get exponentially harder as you add qubits means it may only take a slightly larger quantum machine to get to the point of being truly unbeatable at something.

Still, as Preskill notes, even limited quantum supremacy is “a pivotal step in the quest for practical quantum computers.” Whoever ultimately achieves it will, like the Wright brothers, get to claim a place in history.

