Amoeboid Robot Navigates Without a Brain
A new blob-like robot described in the journal Advanced Robotics uses springs, feet, “protoplasm” and a distributed nervous system to move in a manner inspired by the slime mold Physarum polycephalum. Watch it ooze across a flat surface, The Blob style:
Skip to 1:00 if you just want to be creeped out by its life-like quivering. (And if anyone can explain why, aside from wanting to kill its creepiness, the researcher stabs it with a pen-knife at 1:40, let me know in the comments.)
Researcher Takuya Umedachi of Hiroshima University has been perfecting his blob-bot for years, starting with early prototypes that used springs but lacked an air-filled bladder.
This model didn’t work nearly as well, demonstrating, I guess, the need for a fluid- or air-filled sac when you’re going to project your soft-bodied self in a new direction. (Hydraulic pressure is, after all, how our tongues work.)
Umedachi modeled his latest version on the “true” slime mold, which has been shown to achieve a “human-like” decision-making capacity through properties emerging from the interactions of its individual spores. Slime molds appear to have general computational abilities, and you’ve probably heard that they can solve mazes. Here’s what they look like in the wild.
Soft-bodied robots can do things their rigid, insectoid brethren can’t, like worm their way into tight spots and bounce back in the face of physical insult.
Umedachi’s goal isn’t simply to create a new kind of locomotion, however. He’s exploring how robots that lack a centralized command center – i.e., a brain – can accomplish coordinated behavior anyway. Slime molds are a perfect model for this, because they lack even the primitive neural nets that coordinate swimming and feeding in jellyfish.
From the abstract:
A fully decentralized control using coupled oscillators with a completely local sensory feedback mechanism is realized by exploiting the global physical interaction between the body parts stemming from the fluid circuit. The experimental results show that this robot exhibits adaptive locomotion without relying on any hierarchical structure. The results obtained are expected to shed new light on the design scheme for autonomous decentralized control systems.
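The abstract’s “coupled oscillators with a completely local sensory feedback mechanism” can be illustrated with a toy model. To be clear, this is not the paper’s implementation, just a minimal sketch of the general idea: a ring of phase oscillators in which each unit sees only its two neighbors and its own hypothetical local load sensor. Every parameter value and the feedback form are assumptions for illustration.

```python
import math

# Minimal sketch (not the paper's controller) of decentralized
# coupled phase oscillators with purely local feedback. Each unit i
# reads only its two ring neighbors and its own load sensor; no unit
# ever sees global state.

N = 8                # number of body units around the ring (assumed)
K = 0.5              # neighbor coupling strength (assumed)
OMEGA = 2 * math.pi  # intrinsic frequency, 1 Hz (assumed)
SIGMA = 0.3          # local sensory feedback gain (assumed)
DT = 0.01            # integration time step

def step(phases, loads):
    """Advance every oscillator one step using only local information:
    the two neighboring phases and this unit's own 'load' reading
    (e.g. internal fluid pressure)."""
    n = len(phases)
    new = []
    for i, phi in enumerate(phases):
        left = phases[(i - 1) % n]
        right = phases[(i + 1) % n]
        coupling = K * (math.sin(left - phi) + math.sin(right - phi))
        # Hypothetical local feedback: a loaded unit retards its own phase.
        feedback = -SIGMA * loads[i] * math.sin(phi)
        new.append(phi + DT * (OMEGA + coupling + feedback))
    return new

# Start with unequal phases and no load, then integrate for 20 s.
phases = [0.1 * i for i in range(N)]
loads = [0.0] * N
for _ in range(2000):
    phases = step(phases, loads)

# With no load, purely local coupling still pulls the whole ring toward
# a common rhythm: neighboring phase differences shrink toward zero.
diffs = [abs(math.sin(phases[i] - phases[(i + 1) % N])) for i in range(N)]
print(max(diffs))
```

The point of the sketch is the one the abstract makes: global coordination (a shared rhythm across the body) emerges without any hierarchical structure, purely from local interactions.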
Simulations indicate that the robot should be highly adaptable to deformation – i.e., squeezing through tight spaces.
For a full account of the ways that Umedachi plans to reproduce the world’s most primitive form of cognition in robots, here’s a 2011 talk on the subject by the professor himself.