Scientists Are Defining Quantum-Computing Terms Because Everyone Is Confused

August 23, 2017

One person’s trapped ion is another’s electrostatically defined quantum dot. I’m talking about qubits, by the way—the quantum-computing equivalent of the bits in regular computers. But if you don’t quite follow, don’t worry: you’re far from being alone.

“Confusions exist on what quantum computing or a quantum computer means,” says Hidetoshi Nishimori, a professor at the Tokyo Institute of Technology who specializes in quantum computing. And if he thinks that, then God help the rest of us. But do not fear. For the Institute of Electrical and Electronics Engineers is so painfully aware of the confused language in quantum-computing circles—whether it’s to do with quantum tunneling, superposition, quantum entanglement, or something else entirely—that it’s kicked off a project to bring a little order and understanding to the proceedings.

The snappily titled IEEE P7130 Standard for Quantum Computing Definitions Project will corral experts and define the most important terms in the field so that everybody is on the same page. That, says the IEEE, will “make quantum computing more accessible to a larger group of contributors, including developers of software and hardware, materials scientists, mathematicians, physicists, engineers, climate scientists, biologists and geneticists.” The project is only just starting, but it sounds as if it’s going to be very useful.

And, frankly, the timing is perfect. IBM, Google, Intel, and others are all racing to build the first practical quantum computer, which is why we made quantum computing one of our 10 Breakthrough Technologies of 2017. In fact, Google has already promised that it will build and test a quantum device that beats a regular computer—demonstrating so-called quantum supremacy—before the year is out, and IBM plans to do the same “over the next few years.” Maybe by then we’ll all know what the hell we’re talking about, too.

Illustration by Rose Wong