Tapping Quantum Effects for Software that Learns

Defense contractor Lockheed Martin paid $10 million for a “quantum computer” that is also being tested by Google.

In a bid to enable computers to learn faster, defense company Lockheed Martin has bought a system that uses quantum mechanics to process digital data. It paid $10 million to startup D-Wave Systems for the computer and support using it. D-Wave claims this to be the first ever sale of a quantum computing system.

Quantum calculation: At the center of this image, a series of prototype chips designed to use quantum mechanical effects to work with data.

The new system, called the D-Wave One, is not significantly more capable than a conventional computer. But it could be a step toward fuller implementations of quantum computing, which theoreticians have shown could solve certain problems that are effectively impossible for conventional computers, such as defeating encryption systems by rapidly solving the mathematical problems that underpin them.

In a throwback to the days when computers were the size of rooms, the system bought by Lockheed occupies 100 square feet. Rather than acting as a stand-alone computer, it operates as a specialized helper to a conventional computer running software that learns from past data and makes predictions about future events. Lockheed intends to use its new purchase to help identify bugs in products that are complex combinations of software and hardware; the goal is to reduce cost overruns caused by unforeseen technical problems with such systems, says company spokesperson Thad Madden. Such challenges were partly behind the recent news that the company’s F-35 strike fighter is more than 20 percent over budget.

At the heart of the D-Wave One is a processor made up of 128 qubits—short for quantum bits—which use magnetic fields to represent a single 1 or 0 of digital data at any time and can also exploit quantum mechanics to attain a state of “superposition” representing both at once. When qubits in superposition states are linked together, they can represent exponentially more data than the same number of regular bits.
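
One way to get a feel for that exponential growth is to count what it costs a classical machine just to describe n qubits: 2^n complex amplitudes. The Python sketch below illustrates the bookkeeping only; it is not a model of D-Wave’s hardware.

```python
import numpy as np

# Describing n qubits in superposition classically requires tracking
# 2**n complex amplitudes -- storage that doubles with every added qubit.
def uniform_superposition(n_qubits):
    """State vector for n qubits, each in an equal 0/1 superposition."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(10)
print(len(state))                   # 1024 amplitudes for just 10 qubits
print(np.sum(np.abs(state) ** 2))   # probabilities still sum to 1.0
```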

Those qubits take the form of metal loops rich in niobium, a material that becomes a superconductor at very low temperatures and is more commonly found in the superconducting magnets inside MRI scanners. The qubits are linked by structures called couplers, also made from superconducting niobium alloy, which control the extent to which the magnetic fields of adjacent qubits affect one another. Performing a calculation involves using magnetic fields to set the states of the qubits and couplers, waiting a short time, and then reading out the final values from the qubits.

D-Wave’s machine is intended to do one thing better than a conventional computer: find approximate answers to problems that can only be truly solved by exhaustively trying every possible solution. D-Wave runs a single algorithm, known as quantum annealing, which is hard-wired into the machine’s physical design, says Geordie Rose, D-Wave’s founder and CTO. Data sent to the chip is translated into qubit values and settings for the couplers that connect them. After that, the interlinked qubits go through a series of quantum mechanical changes from which the solution emerges. “You stuff the problem into the hardware and it acts as a physical proxy for what you’re trying to solve,” says Rose. “All physical systems want to sink to the lowest energy level, with the most entropy,” he explains, “and ours sinks to a state that represents the solution.”
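
Rose’s description maps onto what physicists call an Ising model: qubit values become spins, coupler settings become interaction weights, and the answer is the spin configuration with the lowest energy. The sketch below finds such a configuration with classical simulated annealing, a software stand-in for the hardware’s quantum annealing; all the weights and parameters are invented for illustration.

```python
import math
import random

# Toy Ising problem: spins s_i in {-1, +1}, coupler weights J, local fields h.
# Energy(s) = sum over couplers of J[i,j]*s_i*s_j + sum over spins of h[i]*s_i.
# The "solution" is the spin configuration with minimum energy.
# (All numbers here are invented for illustration.)
J = {(0, 1): -1.0, (1, 2): 0.5, (2, 3): -1.0, (0, 3): 0.8}
h = [0.1, -0.2, 0.0, 0.3]

def energy(s):
    e = sum(w * s[i] * s[j] for (i, j), w in J.items())
    return e + sum(hi * si for hi, si in zip(h, s))

def anneal(n_spins, steps=20000, t_start=2.0, t_end=0.01):
    """Classical simulated annealing: a stand-in for quantum annealing."""
    s = [random.choice([-1, 1]) for _ in range(n_spins)]
    e = energy(s)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = random.randrange(n_spins)
        s[i] = -s[i]                       # propose flipping one spin
        e_new = energy(s)
        if e_new > e and random.random() >= math.exp((e - e_new) / t):
            s[i] = -s[i]                   # uphill move rejected: undo flip
        else:
            e = e_new                      # move accepted
    return s, e

spins, e_min = anneal(len(h))
print("lowest-energy configuration:", spins, "energy:", e_min)
```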

Although exotic, this hardware is intended to be used by software engineers who know nothing of quantum mechanics. A set of straightforward application programming interfaces, or APIs, makes it easy to push data to the D-Wave system in a standard format.
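
The article doesn’t detail those APIs, so the client below is purely hypothetical: the endpoint URL, the submit_problem function, and the JSON field names are all invented to suggest what “pushing a problem to the system in a standard format” might look like.

```python
import json
import urllib.request

# Hypothetical client sketch: the endpoint and JSON fields are invented
# for illustration and are not D-Wave's actual API.
def submit_problem(endpoint, biases, couplings):
    """POST per-qubit biases and coupler weights; return the solver's answer."""
    payload = json.dumps({
        "biases": biases,                  # per-qubit field settings
        "couplings": [[i, j, w] for (i, j), w in couplings.items()],
    }).encode("utf-8")
    request = urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)         # e.g. {"spins": [...], "energy": ...}

# Usage (against an imaginary solver):
# result = submit_problem("https://solver.example.com/anneal",
#                         biases=[0.1, -0.2, 0.0, 0.3],
#                         couplings={(0, 1): -1.0, (1, 2): 0.5})
```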

“You send in your problem and then get back a much more accurate result than you would on a conventional computer,” says Rose. He says tests have shown software using the D-Wave system can learn things like how to recognize particular objects in photos up to 9 percent more accurately than a conventional alternative. Rose predicts that the gap will rapidly widen as programmers learn to optimize their code for the way D-Wave’s technology behaves.

Google has been experimenting with D-Wave’s technology for several years as a way to speed up software that can interpret photos. The company’s software engineers use it as a kind of cloud service, accessing a system at D-Wave’s Vancouver headquarters over the Internet. In 2009, Google published papers showing that the quantum system outperformed conventional software running in a Google data center.

Allan Snavely of the San Diego Supercomputer Center has used conventional versions of algorithms like the one built into D-Wave’s system. He says the kind of “needle in a haystack” problems they are designed for are important in computer science. “These are problems where you know the right answer when you see it, but finding it among all the exponential space of possibilities is difficult,” he says. Being able to experiment with the new system using conventional software tools will be tempting to programmers, says Snavely. “It’s intriguing to consider the possibilities—I would like to get my hands on one.”
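
Subset sum is a textbook example of the needle-in-a-haystack structure Snavely describes: verifying a candidate answer takes one addition, but the number of candidates doubles with each item. A brute-force sketch, with made-up data:

```python
from itertools import combinations

numbers = [13, 42, 7, 29, 55, 3, 18, 61]   # made-up data
target = 90

def verify(subset):
    """Checking a proposed answer is easy: one sum and one comparison."""
    return sum(subset) == target

# Finding the answer is the hard part: 2**len(numbers) subsets to try.
found = None
for r in range(len(numbers) + 1):
    for subset in combinations(numbers, r):
        if verify(subset):
            found = subset
            break
    if found is not None:
        break
print("found:", found)   # e.g. (29, 61)
```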

D-Wave’s technology has been dogged by controversy during the 12 years it has been in development, with quantum computing researchers questioning whether the company’s technology truly exploits quantum effects. A paper published in the journal Nature on May 12 went some way toward addressing those concerns, reporting that the behavior of one of the eight-qubit tiles that make up the D-Wave One is better explained by a mathematical model that assumes quantum effects are at work than by one that assumes only classical physics is involved.

However, the experiment did not show the results of running a computation on the hardware, leaving doubt in the minds of many quantum computing experts. Rose says the technology definitely uses quantum effects, but that to programmers only one thing really matters. “Compared to the conventional ways, you get a piece of software that is much better.”
