A month ago, news broke that Google had reportedly achieved “quantum supremacy”: it had gotten a quantum computer to run a calculation that would take a classical computer an unfeasibly long time. While the calculation itself—essentially, a very specific technique for outputting random numbers—is about as useful as the Wright brothers’ 12-second first flight, it would be a milestone of similar significance, marking the dawn of an entirely new era of computing.
But in a blog post published today, IBM disputes Google’s claim. The task that Google says might take the world’s fastest classical supercomputer 10,000 years can actually, says IBM, be done in just days.
As John Preskill, the Caltech physicist who coined the term “quantum supremacy,” wrote in an article for Quanta magazine, Google specifically chose a very narrow task that a quantum computer would be good at and a classical computer would be bad at. “This quantum computation has very little structure, which makes it harder for the classical computer to keep up, but also means that the answer is not very informative,” he wrote.
Google’s research paper hasn’t been published (Update: it came out two days after this story), but a draft was leaked online last month. In it, researchers say they got a machine with 53 quantum bits, or qubits, to do the calculation in 200 seconds. They also estimated that it would take the world’s most powerful supercomputer, the Summit machine at Oak Ridge National Laboratory, 10,000 years to repeat it with equal “fidelity,” or the same level of uncertainty as the inherently uncertain quantum system.
The problem is that such simulations aren’t just a matter of porting the code from a quantum computer to a classical one. They grow exponentially harder the more qubits you’re trying to simulate. For that reason, there are a lot of different techniques for optimizing the code to arrive at a good enough equivalent.
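The scaling argument can be made concrete with a back-of-envelope calculation (this is an illustration, not a figure from either paper): a full state-vector simulation of n qubits must track 2^n complex amplitudes, so memory doubles with every qubit added.

```python
# Illustrative calculation: memory needed for a brute-force
# state-vector simulation of an n-qubit quantum computer.
# An n-qubit state has 2**n complex amplitudes; at 16 bytes each
# (two 64-bit floats), the total doubles with every added qubit.

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes needed to hold the full state vector of n qubits."""
    return (2 ** n_qubits) * 16  # 16 bytes per complex128 amplitude

for n in (30, 40, 53):
    gb = state_vector_bytes(n) / 1e9
    print(f"{n} qubits -> {gb:,.0f} GB")
# 53 qubits works out to roughly 144 petabytes.
```

At 30 qubits the state fits on a laptop; at 53 qubits, the size Google used, the raw state vector alone runs to about 144 petabytes, which is why naive simulation is out of the question and clever approximations matter so much.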
And that’s where Google and IBM differ. The IBM researchers propose a method that they say would take just two and a half days on a classical machine “with far greater fidelity,” and that “with additional refinements” this could come down even further.
The key difference? Hard drives. Simulating a quantum computer in a classical one requires storing vast amounts of data in memory during the process to represent the condition of the quantum computer at any given moment. The less memory you have available, the more you have to slice up the task into stages, and the longer it takes. Google’s method, IBM says, relied heavily on storing that data in RAM, while IBM’s “uses both RAM and hard drive space.” The IBM paper also proposes a slew of other classical optimization techniques, in both hardware and software, to speed up the computation. To be fair, IBM hasn’t tested its approach in practice, so it’s hard to know if it would work as proposed. (Google declined to comment.)
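A rough sketch shows why disk space changes the picture. The Summit figures below are approximate, publicly reported specs (not numbers from either paper): the machine has on the order of a few petabytes of RAM but roughly 250 petabytes of file storage.

```python
# Rough illustration of IBM's argument. Summit capacities are
# approximate, publicly reported figures, used here only to show
# the orders of magnitude involved.
STATE_VECTOR_PB = (2 ** 53) * 16 / 1e15  # ~144 PB for 53 qubits
SUMMIT_RAM_PB = 2.8                      # aggregate RAM, approx.
SUMMIT_DISK_PB = 250                     # parallel file system, approx.

# If the state must live in RAM alone, the simulation has to be
# sliced into many partial passes; spilling to disk lets a single
# pass hold the entire state vector.
print(f"state vector: ~{STATE_VECTOR_PB:.0f} PB")
print(f"fits in RAM alone:  {STATE_VECTOR_PB <= SUMMIT_RAM_PB}")
print(f"fits in RAM + disk: {STATE_VECTOR_PB <= SUMMIT_RAM_PB + SUMMIT_DISK_PB}")
```

The full 53-qubit state vector is far too big for RAM but comfortably fits on disk, which is the crux of IBM’s claim that the task never needed to be chopped into the many costly stages Google’s estimate assumed.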
So what’s at stake? Either a whole lot or not much, depending on how you look at it. As Preskill points out, the problem Google reportedly solved is of almost no practical consequence, and even as quantum computers get bigger, it will be a long time before they can solve any but the narrowest classes of problems. Ones that can crack modern codes will likely take decades to develop, at a minimum.
Moreover, even if IBM is right that Google hasn’t achieved it this time, the quantum supremacy threshold is surely not far off. The fact that simulations get exponentially harder as you add qubits means it may only take a slightly larger quantum machine to get to the point of being truly unbeatable at something.
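To see how quickly the gap could reopen, consider a hedged extrapolation: if classical cost roughly doubles with each added qubit, as the memory argument suggests, then even IBM’s two-and-a-half-day estimate balloons fast. The numbers below are illustrative, not claims from either paper.

```python
# Hedged extrapolation: assume classical simulation cost roughly
# doubles per added qubit. Starting from IBM's estimate of 2.5 days
# at 53 qubits, see how a modestly larger machine runs away.
ibm_days_at_53 = 2.5

for extra in (0, 5, 10):
    days = ibm_days_at_53 * 2 ** extra
    print(f"{53 + extra} qubits: ~{days:,.1f} simulated days")
# Ten more qubits pushes ~2.5 days to ~2,560 days (about 7 years),
# before even asking whether the state still fits on disk.
```

Under this crude model, a machine only ten qubits larger would already take a classical supercomputer years to match, which is why the supremacy threshold looks close regardless of who wins this particular dispute.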
Still, as Preskill notes, even limited quantum supremacy is “a pivotal step in the quest for practical quantum computers.” Whoever ultimately achieves it will, like the Wright brothers, get to claim a place in history.