It’s an analogy that goes back to the dawn of the computer era: ever since we discovered that machines could solve problems by manipulating symbols, we’ve wondered if the brain might work in a similar fashion. Alan Turing, for example, asked what it would take for a machine to “think”; writing in 1950, he predicted that by the year 2000 “one will be able to speak of machines thinking without expecting to be contradicted.” If machines could think like human brains, it was only natural to wonder if brains might work like machines. Of course, no one would mistake the gooey material inside your brain for the CPU inside your laptop—but beyond the superficial differences, it was suggested, there might be important similarities.
Today, all these years later, experts are divided. Although everyone agrees that our biological brains create our conscious minds, they’re split on the question of what role, if any, is played by information processing—the crucial similarity that brains and computers are alleged to share.
While the debate may sound a bit academic, it actually has real-world implications: the effort to build machines with human-like intelligence depends at least in part on understanding how our own brains actually work, and how similar—or not—they are to machines. If brains could be shown to function in a way that was radically different from a computer, it would call into question many traditional approaches to AI.
The question may also shape our sense of who we are. As long as brains, and the minds they enable, are thought of as unique, humankind might imagine itself to be very special indeed. Seeing our brains as nothing more than sophisticated computational machinery could burst that bubble.
We asked the experts to tell us why they think we should—or shouldn’t—think of the brain as being “like a computer.”
AGAINST: The brain can’t be a computer because it’s biological.
Everyone agrees that the actual stuff inside a brain—“designed” over billions of years by evolution—is very different from what engineers at IBM and Google put inside your laptop or smartphone. For starters, brains are analog. The brain’s billions of neurons behave very differently from the digital switches and logic gates in a digital computer. “We’ve known since the 1920s that neurons don’t just turn on and off,” says biologist Matthew Cobb of the University of Manchester in the UK. “As the stimulus increases, the signal increases,” he says. “The way a neuron behaves when it’s stimulated is different from any computer that we’ve ever built.”
Blake Richards, a neuroscientist and computer scientist at McGill University in Montreal, agrees: brains “process everything in parallel, in continuous time” rather than in discrete intervals, he says. In contrast, today’s digital computers employ a very specific design based on the original von Neumann architecture. They work largely by going step by step through a list of instructions encoded in a memory bank, while accessing information stored in discrete memory slots.
“None of that has any resemblance to what goes on in your brain,” says Richards. (And yet, the brain keeps surprising us: in recent years, some neuroscientists have argued that even individual neurons can perform certain kinds of computations, comparable to what computer scientists call an XOR, or “exclusive or,” function.)
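The XOR function mentioned above is a classic benchmark because no single linear threshold unit can compute it. As a hedged illustration (not from the article), here is how a tiny two-layer network with hand-picked weights pulls it off:

```python
# Illustrative sketch: XOR via a two-layer network of threshold units.
# The weights below are hand-chosen for clarity, not learned.

def step(x):
    """A simple threshold 'neuron': fires (1) if its input exceeds 0."""
    return 1 if x > 0 else 0

def xor(a, b):
    h1 = step(a + b - 0.5)      # hidden unit 1: fires if at least one input is on
    h2 = step(1.5 - a - b)      # hidden unit 2: fires unless both inputs are on
    return step(h1 + h2 - 1.5)  # output: fires only if both hidden units fire

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```

The point of the neuroscience finding is that a single biological neuron, with its branching dendrites, may be able to do on its own what this sketch needs three artificial units to accomplish.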
FOR: Sure it can! The actual structure is beside the point.
But perhaps what brains and computers do is fundamentally the same, even if the architecture is different. “What the brain seems to be doing is quite aptly described as information processing,” says Megan Peters, a cognitive scientist at the University of California, Irvine. “The brain takes spikes [brief bursts of activity that last about a tenth of a second] and sound waves and photons and converts it into neural activity—and that neural activity represents information.”
Richards, who agrees with Cobb that brains work very differently from today’s digital computers, nonetheless believes the brain is, in fact, a computer. “A computer, according to the usage of the word in computer science, is just any device which is capable of implementing many different computable functions,” says Richards. By that definition, “the brain is not simply like a computer. It is literally a computer.”
Michael Graziano, a neuroscientist at Princeton University, echoes that sentiment. “There’s a more broad concept of what a computer is, as a thing that takes in information and manipulates it and, on that basis, chooses outputs. And a ‘computer’ in this more general conception is what the brain is; that’s what it does.”
But Anthony Chemero, a cognitive scientist and philosopher at the University of Cincinnati, objects. “What seems to have happened is that over time, we’ve watered down the idea of ‘computation’ so that it no longer means anything,” he says. “Yes, your brain does stuff, and it helps you know things—but that’s not really computation anymore.”
FOR: Traditional computers might not be brain-like, but artificial neural networks are.
All of the biggest breakthroughs in artificial intelligence today have involved artificial neural networks, which use “layers” of mathematical processing to assess the information they’re fed. The connections between the layers are assigned weights (roughly, a number that corresponds to the importance of each connection relative to the others—think of how a professor might work out a final grade based on a series of quiz results but assign a greater weight to the final quiz). Those weights are adjusted as the network is exposed to more and more data, until the last layer produces an output. In recent years, neural networks have been able to recognize faces, translate languages, and even mimic human-written text in an uncanny way.
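The grading analogy above can be made concrete. This is a minimal sketch (the numbers are invented for illustration) of how a weighted sum works, which is also the basic operation inside each artificial neuron:

```python
# Illustrative sketch: a weighted sum, the core operation of an artificial
# neuron. Analogy from the text: a professor weighting quiz scores.

def weighted_score(scores, weights):
    """Combine scores, giving each one a weight, then normalize."""
    assert len(scores) == len(weights)
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_weight

# Three quizzes; the final quiz counts twice as much as the others.
scores = [80, 90, 100]
weights = [1, 1, 2]
print(weighted_score(scores, weights))  # -> 92.5
```

Training a neural network amounts to nudging weights like these, across millions of connections, until the final layer's outputs match the desired answers.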
“An artificial neural network is actually basically just an algorithmic-level model of a brain,” says Richards. “It is a way of trying to model the brain without reference to the specific biological details of how the brain works.” Richards points out that this was the explicit goal of neural-network pioneers like Frank Rosenblatt, David Rumelhart, and Geoffrey Hinton: “They were specifically interested in trying to understand the algorithms that the brain uses to implement the functions that brains successfully compute.”
Scientists have recently developed neural networks whose workings are said to more closely resemble those of actual human brains. One such approach, predictive coding, is based on the premise that the brain is constantly trying to predict what sensory inputs it’s going to receive next; the idea is that “keeping up” with the outside world in this way boosts its chances for survival—something that natural selection would have favored. It’s an idea that resonates with Graziano. “The purpose of having a brain is movement—being able to interact physically with the external world,” he says. “That’s what the brain does; that’s the heart of why you have a brain. It’s to make predictions.”
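The core loop of predictive coding can be sketched in a few lines. This is a hedged, toy version (the learning rate and inputs are invented for illustration): the system holds an internal estimate, predicts the next sensory input, and corrects itself in proportion to the prediction error.

```python
# Toy sketch of a predictive-coding update: nudge an internal estimate
# toward each new observation, in proportion to the prediction error.

def update_estimate(estimate, observation, learning_rate=0.1):
    error = observation - estimate          # prediction error
    return estimate + learning_rate * error # shrink the error a little

estimate = 0.0
for obs in [10, 10, 10, 10, 10]:            # a steady sensory signal
    estimate = update_estimate(estimate, obs)

print(round(estimate, 3))  # -> 4.095, creeping toward the true value of 10
```

Real predictive-coding models stack many such error-correcting layers, but the principle is the same: minimize the gap between what the brain expects and what the senses deliver.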
AGAINST: Even if brains work like neural networks, they’re still not information processors.
Not everyone thinks neural networks support the notion that our brains are like computers. One problem is that they are inscrutable: when a neural network solves a problem, it may not be at all clear how it solved the problem, making it harder to argue that its method was in any way brain-like. “The artificial neural networks that people like Hinton are working on now are so complicated that even if you try to analyze them to figure out what parts were storing information about what, and what counts as the manipulation of that information, you’re not going to be able to pull that out,” says Chemero. “The more complicated they get, the more intractable they become.”
But defenders of the brain-as-computer analogy say that doesn’t matter. “You can’t point to the 1s and 0s,” says Graziano. “It’s distributed in a pattern of connectivity that was learned among all those artificial neurons, so it’s hard to ‘talk shop’ about exactly what the information is, where it’s stored, and how it’s encoded—but you know it’s there.”
FOR: The brain has to be a computer; the alternative is magic.
If you’re committed to the idea that the physical brain creates the mind, then computation is the only viable path, says Richards. “Computation just means physics,” he says. “The only other option is that you’re proposing some kind of magical ‘soul’ or ‘spirit’ or something like that ... There’s literally only two options: either you’re running an algorithm or you’re using magic.”
AGAINST: The brain-as-computer metaphor can’t explain how we derive meaning.
No matter how sophisticated a neural network may be, the information that flows through it doesn’t actually mean anything, says Romain Brette, a theoretical neuroscientist at the Vision Institute in Paris. A facial-recognition program, for example, might peg a particular face as being mine or yours—but ultimately it’s just tracking correlations between two sets of numbers. “You still need someone to make sense of it, to think, to perceive,” he says.
Which doesn’t mean that the brain doesn’t process information—perhaps it does. “Computation is probably very important in the explanation of the mind and intelligence and consciousness,” says Lisa Miracchi, a philosopher at the University of Pennsylvania. Still, she emphasizes that what the brain does and what the mind does are not necessarily the same. And even if the brain is computer-like, the mind may not be: “Mental processes are not computational processes, because they’re inherently meaningful, whereas computational processes are not.”
So where does that leave us? The question of whether the brain is or is not like a computer appears to depend partly on what we mean by “computer.” But even if the experts could agree on a definition, the question seems unlikely to be resolved anytime soon—perhaps because it is so closely tied to thorny philosophical problems, like the so-called mind-body problem and the puzzle of consciousness. We argue about whether the brain is like a computer because we want to know how minds came to be; we want to understand what allows some arrangements of matter, but not others, not only to exist but to experience.