When the brain learns a new motor skill, neurons form circuits that activate the body’s muscles to perform it. But the same distributed network controls related motor tasks, so when you try to learn many skills at once, new modifications to existing patterns can interfere with previously learned skills.
“This is particularly tricky when you’re learning very similar things,” says Institute Professor Emilio Bizzi. A new computational model he and McGovern Institute researcher Robert Ajemian developed explains how the brain solves this problem.
The brain is massively parallel: each neuron connects to about 10,000 others on average. That makes interference far more likely than in standard computer chips, which process data serially and store the instructions for each task in a separate location.
But that same connectivity lets the brain test out many possible solutions for combining tasks. Neurons constantly change the strength of their connections, a trait known as hyperplasticity, and they receive roughly as much noise as useful signal from their neighbors.
Without that noise, the hyperplastic brain would overwrite existing memories too easily. Conversely, without hyperplasticity, the noise would drown out the tiny, systematic changes in connectivity that learning a new skill requires.
“Your brain is always trying to find the configurations that balance everything so you can do two tasks, or three tasks, or however many you’re learning,” Ajemian says. “There are many ways to solve a task, and you’re exploring all the different ways.”
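The balance the researchers describe can be illustrated with a toy simulation. The sketch below is entirely illustrative and not the authors' actual model: it interleaves two overlapping "tasks" (a shared set of weights must map two similar input patterns to different targets), uses a large learning rate to stand in for hyperplasticity, and injects random noise into every weight update. Despite the constant noisy rewiring, the network settles into a configuration that solves both tasks at once.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two similar "tasks" that share the same weights:
# each input pattern must be mapped to its own target output.
X = np.array([[1.0, 0.0],
              [0.6, 0.8]])        # overlapping input patterns
y = np.array([1.0, -1.0])         # per-task targets

w = rng.normal(size=2)            # synaptic weights
eta = 0.5                         # large learning rate: "hyperplasticity"
noise = 0.02                      # random perturbation of every update

for step in range(20000):
    i = step % 2                          # interleave the two tasks
    err = y[i] - X[i] @ w                 # error on the current task
    # Hyperplastic, noisy update: a big corrective step
    # plus random exploration of nearby weight configurations.
    w += eta * err * X[i] + noise * rng.normal(size=2)

final_err = np.abs(y - X @ w)
print(final_err)   # both task errors stay small despite the noise
```

The point of the sketch is the balance itself: with the noise term removed, each big update partially overwrites the other task's solution before the interleaving rescues it; with the learning rate made tiny, the noise dominates and the weights never settle. Only the combination keeps both tasks solved.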