Do-It-All Neurons

A key to cognitive flexibility.

Over the past few decades, neuroscientists have made great progress in mapping the brain by deciphering the functions of individual neurons that perform very specific tasks, such as recognizing the location or color of an object.

Artist’s rendering of a neuron.

But many neurons, especially in brain regions that perform sophisticated functions such as thinking and planning, don’t fit into this pattern. Instead of responding exclusively to one stimulus or task, these neurons react in different ways to a wide variety of things.

MIT neuroscientist Earl Miller first noticed these unusual activity patterns about 20 years ago, while recording the electrical activity of neurons in animals trained to perform complex tasks. During one task, such neurons might distinguish between colors, but under different conditions, they might issue a motor command.

At the time, Miller and colleagues proposed that this type of neuronal flexibility is key to cognitive flexibility, which gives the brain its ability to learn so many new things on the fly. At first, that theory encountered resistance “because it runs against the traditional idea that you can figure out the clockwork of the brain by figuring out the one thing each neuron does,” Miller says.

In a recent paper published in Nature, Miller and colleagues at Columbia University described a computer model they developed to determine more precisely what role these flexible neurons play in cognition. They found that the cells are critical to the human brain’s ability to learn a large number of complex tasks.

Columbia professor Stefano Fusi created the model using experimental data gathered by Miller and his former grad student Melissa Warden, PhD ‘06. The data came from electrical recordings from brain cells of monkeys trained to look at a sequence of two pictures and remember the pictures and the order in which they appeared.

The computer model revealed that these flexible neurons are critical to performing this kind of complex task and that they greatly expand the capacity to learn many different things. In the model, neural networks without flexible neurons could learn about 100 tasks before running out of capacity. That capacity expanded to tens of millions of tasks as flexible neurons were added. When flexible neurons reached about 30 percent of the total, the network’s capacity became “virtually unlimited,” Miller says, just like that of a human brain.
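A toy sketch can illustrate the underlying idea, often called nonlinear mixed selectivity. This is not the authors’ actual model; it is a minimal numpy example assuming two binary task variables and a neuron whose response mixes them nonlinearly. Adding that one mixed unit raises the dimensionality of the population’s representation to full rank, which means a simple linear readout can implement any rule over the task conditions, including rules that pure single-variable neurons cannot support.

```python
import numpy as np

# Four task conditions: all combinations of two binary variables,
# e.g. stimulus identity (a) and serial position (b).
conds = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# "Pure" neurons each encode exactly one variable; include a constant
# bias unit so a linear readout can shift its threshold.
pure = np.column_stack([np.ones(4), conds])                  # shape (4, 3)

# A "mixed-selectivity" neuron responds nonlinearly to the combination
# of both variables (here modeled as the product a * b).
mixed = np.column_stack([pure, conds[:, 0] * conds[:, 1]])   # shape (4, 4)

# Matrix rank measures the dimensionality of the representation.
# Full rank (4, one per condition) means a linear readout can realize
# any labeling of the four conditions, XOR-like rules included.
print(np.linalg.matrix_rank(pure))   # 3: XOR-like rules are out of reach
print(np.linalg.matrix_rank(mixed))  # 4: every rule is linearly separable
```

The capacity result in the article follows the same logic at scale: each nonlinear mixture adds usable dimensions, so the number of distinct tasks a linear readout can learn grows rapidly with the fraction of mixed-selectivity neurons.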
