When the brain learns a new motor skill, neurons form circuits that activate the body’s muscles to perform it. But the same distributed network controls related motor tasks, so when you try to learn many skills at once, new modifications to existing patterns can interfere with previously learned skills.
“This is particularly tricky when you’re learning very similar things,” says Institute Professor Emilio Bizzi. A new computational model he and McGovern Institute researcher Robert Ajemian developed explains how the brain solves this problem.
The brain is massively parallel, with each neuron connected to about 10,000 others on average. That connectivity might seem to make interference more likely than in standard computer chips, which process data serially and store the instructions for each task in a separate location.
But that connectivity allows the brain to test out many possible solutions to achieve combinations of tasks. Neurons are constantly changing the strength of these connections, a trait known as hyperplasticity. They also receive about as much useless information as useful input from their neighbors.
Without that noise, the hyperplastic brain would overwrite existing memories too easily. But without hyperplasticity, the noise would drown out the small, consistent changes in connectivity needed to learn new skills.
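This balance can be sketched in a toy simulation (not the researchers' actual model; the learning rate, noise scale, and target value below are all illustrative assumptions). Each update to a connection weight is dominated by zero-mean noise, yet because the useful error signal points the same way on every trial, it accumulates while the noise averages out:

```python
import random

random.seed(0)

def train(weight, target, lr, noise_scale, steps=5000):
    """Nudge a connection weight toward `target` with noisy updates.

    lr          -- stands in for plasticity: the size of each change
    noise_scale -- stands in for the useless input mixed into every update
    """
    for _ in range(steps):
        signal = target - weight               # small, consistent error signal
        noise = random.gauss(0.0, noise_scale)  # zero-mean, trial-to-trial noise
        weight += lr * (signal + noise)         # each step is mostly noise
    return weight

# High plasticity plus comparable noise: the noise washes out over many
# trials, while the consistent signal accumulates, so the weight still
# settles near the target.
final = train(weight=0.0, target=1.0, lr=0.05, noise_scale=1.0)
```

Turning the noise off in this sketch would let any single trial move the weight decisively, which is the analogue of too easily overwriting an existing memory; shrinking the learning rate instead would leave the signal buried in the noise.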
“Your brain is always trying to find the configurations that balance everything so you can do two tasks, or three tasks, or however many you’re learning,” Ajemian says. “There are many ways to solve a task, and you’re exploring all the different ways.”