
Q&A: Seth Lloyd

A pioneer of quantum computing believes the universe is a quantum computer.

Seth Lloyd, a professor of mechanical engineering at MIT, is among the pioneers of quantum computing: he proposed the first technologically feasible design for a quantum computer. If humans ever build a useful, general-purpose quantum computer, it will owe much to Lloyd. Earlier this year, he published a popular introduction to quantum theory and computing, titled Programming the Universe, which advanced the startling thesis that the universe is itself a quantum computer.


Technology Review: In your new book, you are admirably explicit: you write, “The Universe is indistinguishable from a quantum computer.” How can that be true?

Seth Lloyd: I know it sounds crazy. I feel apologetic when I say it. And people who have reviewed the book take it as a metaphor. But it’s factually the case. We couldn’t build quantum computers unless the universe were quantum and computing. We can build such machines because the universe is storing and processing information in the quantum realm. When we build quantum computers, we’re hijacking that underlying computation in order to make it do things we want: little and/or/not calculations. We’re hacking into the universe.

TR: Your critics can be forgiven for thinking you wrote metaphorically. In every era, scientists have likened the universe to the most complicated technology they knew. Newton thought the universe was like a clock.

SL: You could be more blunt: “Lloyd builds quantum computers; therefore, Lloyd thinks the universe is a quantum computer.” But I think that’s unfair.

TR: You famously believe in “it from bit”: that is, that information is a physical property of the universe, and that information generates more-complex information – and with it, all the phenomenal world.

SL: Imagine the electron, which an ordinary computer uses to store data. How can it have information associated with it? The electron can be either here or there. So it registers a bit of information, one of two possibilities: on or off.

TR: Sure, but how does the quantity of information increase?

SL: If you’re looking for places where the laws of physics allow for information to be injected into the universe, then you must look to quantum mechanics. Quantum mechanics has a process called “decoherence” – which takes place during measurement, for instance. A qubit [or quantum bit] that was, weirdly, both here and there is suddenly here or there. Information has been added to the universe.
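The measurement Lloyd describes can be sketched numerically. The following is a minimal illustration (not anything from the interview): a qubit in an equal superposition is measured, and the Born rule picks one definite outcome, turning a "both here and there" state into a single classical bit.

```python
import random

def measure(amp0, amp1):
    """Measure a qubit with amplitudes amp0 (for |0>) and amp1 (for |1>).

    Before measurement the qubit is in a superposition; measurement
    selects one outcome with probability |amplitude|^2 (the Born rule),
    yielding a definite classical bit where there was none before.
    """
    p0 = abs(amp0) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: (1/sqrt(2))|0> + (1/sqrt(2))|1>
amp = 2 ** -0.5
outcomes = [measure(amp, amp) for _ in range(10_000)]

# Each run produces one new definite bit; over many runs the
# outcomes split roughly evenly between 0 and 1.
print(sum(outcomes) / len(outcomes))
```

Each call collapses the superposition into one bit of definite information, which is the sense in which measurement "adds" information to the record of the world.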

TR: And why does the universe tend to complexity?

SL: This notion of the universe as a giant quantum computer gets you something new and important that you don’t get from the ordinary laws of physics. If you look back 13.8 billion years to the beginning of the universe, the initial state was extremely simple, requiring only a few bits to describe. But I see on your table an intricate, very beautiful orchid – where the heck did all that complex information come from? The laws of physics are silent on this issue. They have no explanation. They do not encode some yearning for complexity.

TR: [Utterly bemused] Hmmm …

SL: Could the universe have arisen from total randomness? No. If we imagine that every elementary particle was a monkey typing since time began at the maximum speed allowed by the laws of physics, the longest stretch of Hamlet that could have been generated is something like “To be or not to be, that is the –.” But imagine monkeys typing at computers that recognize the random gibberish as a program. Algorithmic information theory shows that there are short, random-looking programs that can cause a computer to write down all the laws of physics. So for the universe to be complex, you need random generation, and you need something to process that information according to a few simple rules: in other words, a quantum computer.
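Lloyd's monkey estimate can be roughed out with back-of-the-envelope arithmetic. Assuming, as an illustration rather than figures from the interview, that the universe has performed on the order of 10^120 elementary operations (Lloyd's own published estimate of the universe's computational capacity) and that a typewriter has about 50 keys, the longest text expected to appear by pure chance has length n where 50^n ≈ 10^120:

```python
import math

OPS = 1e120   # assumed total elementary operations since the Big Bang
KEYS = 50     # assumed typewriter alphabet size

# A specific n-character string shows up among OPS random keystrokes
# only while KEYS**n <= OPS, i.e. n <= log(OPS) / log(KEYS).
n_max = math.log(OPS) / math.log(KEYS)
print(round(n_max))  # roughly 70 characters -- about one line of Hamlet
```

Exhausting the computational capacity of the entire universe on random typing buys only a line or so of Shakespeare, which is why randomness alone cannot account for the complexity we see; short random programs run on a computer can.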

TR: More practically: how far are we from widely used, commercial applications of quantum computing?

SL: Today, the largest general-purpose quantum computer is only a dozen qubits. So we’re at least a decade or two away. But we’ve already built quantum computers that simulate other quantum systems: you could call them quantum analog computers. These little machines can perform computations that would require an ordinary computer larger than the universe.

TR: What’s the next big thing that needs to be done in quantum computing?

SL: From the techno-geek, experimentalist point of view, it’s the pacification of the microscopic, quantum world. It’s the Wild West down there.

TR: Programming the Universe concludes with a personal note. You describe how your friend Heinz Pagels, a renowned physicist, fell to his death while hiking with you in Colorado. You find some consolation in your theory of universal quantum computation: “But we have not entirely lost him. While he lived, Heinz programmed his own piece of the universe. The resulting computation unfolds in us and around us …”

SL: Well, it’s pretty poor consolation when someone you love is dead. But it’s a truer consolation than the idea that one day you might meet him in heaven.
