Embodiment, Computation and the Nature of Artificial Intelligence
One of the buzzwords in artificial intelligence research these days is ‘embodiment’, the idea that intelligence requires a body.

But in the last few years, a growing body of researchers has begun to explore the possibility that this definition is too limited. Led by Rolf Pfeifer at the Artificial Intelligence Laboratory at the University of Zurich, Switzerland, these guys say that the notion of intelligence makes no sense outside of the environment in which it operates.
For them, the notion of embodiment must capture not only how the brain is embedded in a body but also how this body is embedded in the broader environment.
Today, Pfeifer and Matej Hoffmann, also at the University of Zurich, set out this thinking in a kind of manifesto for a new approach to AI. And their conclusion has far-reaching consequences. They say it’s not just artificial intelligence that we need to redefine, but the nature of computing itself.
The paper takes the form of a number of case studies examining the nature of embodiment in various physical systems. For example, Pfeifer and Hoffmann look at the distribution of light-sensing cells within fly eyes.
Biologists have known for 20 years that these are not distributed evenly in the eye but are more densely packed towards the front of the eye than to the sides. What’s interesting is that this distribution compensates for the phenomenon of motion parallax.
When a fly is in constant forward motion, objects to the side move across its field of vision faster than those to the front. “This implies that under the condition of straight flight, the same motion detection circuitry can be employed for motion detection for the entire eye,” point out Pfeifer and Hoffmann.
That’s a significant advantage for the fly. With any other distribution of light-sensitive cells, it would require much more complex motion-detecting circuitry.
Instead, the particular distribution of cells simplifies the problem. In a sense, the morphology of the eye itself performs a computation. A few years ago, a team of AI researchers built a robot called Eyebot that exploited exactly this effect.
What’s important, however, is that the computation is the result of three factors: simple motion-detection circuitry in the brain, the morphology or distribution of cells in the body, and the nature of flight in a 3-dimensional universe.
Without any of these, the computation wouldn’t work and, indeed, wouldn’t make sense.
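To make the idea concrete, here’s a rough numerical sketch (my illustration, not taken from the paper). During straight flight at speed v past objects at distance r, a point at angle theta from the heading sweeps across the retina at roughly v·sin(theta)/r, so the image moves slowly at the front and quickly at the sides. If the angular spacing between neighbouring light-sensing cells also scales with sin(theta), tightly packed at the front and sparse at the sides, the time an image takes to hop from one cell to the next comes out the same everywhere, which is why a single, fixed motion-detection circuit can serve the whole eye. The speed, distance and cell-spacing values below are arbitrary assumptions chosen purely to illustrate the effect.

```python
import numpy as np

# Assumed illustrative values, not measurements from flies or from the paper.
v = 1.0                            # forward flight speed (arbitrary units)
r = 5.0                            # distance to surrounding objects (assumed constant)
base_spacing = np.radians(1.0)     # inter-cell angle at theta = 90 degrees (to the side)

thetas = np.radians([10.0, 30.0, 60.0, 90.0])    # viewing angles measured from the heading
omega = v * np.sin(thetas) / r                   # retinal angular velocity of a passing object

uniform_dt = base_spacing / omega                # cell-to-cell crossing time, evenly spaced cells
matched_spacing = base_spacing * np.sin(thetas)  # spacing that mirrors the sin(theta) motion profile
matched_dt = matched_spacing / omega             # crossing time with the morphology-matched layout

for deg, du, dm in zip(np.degrees(thetas), uniform_dt, matched_dt):
    print(f"theta = {deg:4.0f} deg   uniform eye: {du:6.3f}   matched eye: {dm:6.3f}")
```

With an evenly spaced eye the crossing time varies almost six-fold between 10 and 90 degrees, so the motion-detection circuitry would have to cope with very different signal timings; with the sin(theta)-matched layout it is identical at every angle, which is the ‘computation’ the morphology performs.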
We’ve looked at examples of morphological computation on this blog in the past (here and here, for example). And Pfeifer has been shouting from the rooftops for several years, with some success, about the role that shape and form play in biological computation.
But today he and Hoffmann go even further. They say that various low-level cognitive functions such as locomotion are clearly simple forms of computation involving the brain-body-environment triumvirate.
That’s why our definition of computation needs to be extended to include the influence of environment, they say.
For many simple actions, such as walking, these computations proceed more or less independently. These are ‘natural’ actions in the sense that they exploit the natural dynamics of the system.
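For a feel of what ‘exploiting natural dynamics’ means in the walking case, here is a minimal sketch (again my illustration, with assumed numbers, not an example from the paper). A swing leg modelled as a simple pendulum swings forward under gravity alone, with no motor torque at all; its timing is set by leg length and gravity, that is, by body and environment, rather than by any explicit control computation.

```python
import math

g = 9.81        # gravitational acceleration (m/s^2): the environment's contribution
L = 0.9         # effective leg length (m), an assumed value: the body's contribution
dt = 1e-4       # integration time step (s)

theta = math.radians(25.0)   # leg starts swung 25 degrees behind vertical
omega = 0.0                  # released from rest
t = 0.0

# Integrate theta'' = -(g / L) * sin(theta) with zero applied torque until the
# leg has swung through to the mirror-image forward position.
while theta > -math.radians(25.0):
    alpha = -(g / L) * math.sin(theta)
    omega += alpha * dt
    theta += omega * dt
    t += dt

print(f"passive swing time: {t:.3f} s "
      f"(small-angle estimate pi*sqrt(L/g) = {math.pi * math.sqrt(L / g):.3f} s)")
```

The point of the toy model is simply that the swing period arrives ‘for free’ from morphology and physics; a controller only needs to nudge the leg and time its interventions, which is the sense in which body and environment do part of the computational work.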
But they also say this kind of computation provides a platform on which more complex cognitive tasks can take place relatively easily. They think that systems emerge in the brain that can predict the outcome of these natural computations. That’s obviously useful for forward planning.
Pfeifer and Hoffmann’s idea is that more complex cognitive abilities emerge when these forward-planning mechanisms become decoupled from the system they are predicting.
That’s an interesting prediction that should lend itself to testing in the next few years.
But first, researchers will have to broaden the way they think not only about AI but also about the nature of computing itself.
Clearly an interesting and rapidly evolving field.
Ref: arxiv.org/abs/1202.0440: The Implications of Embodiment for Behavior and Cognition: Animal and Robotic Case Studies