Christopher Moore was teaching a group of doctors about functional magnetic resonance imaging (fMRI) 13 years ago when it dawned on him that he didn’t really believe what he was saying. A common brain-scanning technique, fMRI allows doctors and researchers to see local changes in blood flow, indicating where information processing is most active. Moore, who was at MIT working on his PhD in neuroscience at the time, told the doctors that blood was rushing to those areas to resupply hungry neurons with the oxygen and sugar their work was consuming. But it seemed to Moore that metabolism alone couldn’t account for the volume of blood showing up in the scans. “This makes no sense,” he thought. “There must be something else going on beyond just metabolism.”

The more he pondered the fMRI signal, the more it seemed to him that blood was not merely feeding neurons but directly helping neurons process information. He wasn’t yet in a position to test his hypothesis, but he knew that if he could prove it, it could change neuroscience. Now, new research suggests that he may have been right.

Because different regions of the brain are responsible for different kinds of information processing, the sensitivity of their circuitry, even of individual neurons, changes in response to new stimuli. That's what makes it possible for us to react to the world around us, whether by coming up with a witty riposte or hitting a fastball, as the occasion demands. Moore thinks that increases and decreases in blood flow contribute to these shifts in sensitivity, and that blood and neurons work hand in hand to produce perception and cognition. Given that neuroscientists have always attributed information processing to neurons alone, the notion that blood helps us think is radical.

It also has practical implications. If Moore is right, many brain disorders might be treatable in new ways. Drugs or devices that control blood flow, for instance, could be used to control problems, such as epileptic seizures, that can result when neurons become oversensitive.

To test whether blood modulates the sensitivity of neurons, Moore had to find a way to experimentally alter blood flow within single small vessels of the brain in living animals, and then to watch what happens in individual cells. Such fine control over blood flow in the brain had never previously been achieved. But this spring, Moore, an assistant professor in the Department of Brain and Cognitive Sciences and a principal investigator at the McGovern Institute for Brain Research, teamed up with assistant professor Edward Boyden, a neurotechnology whiz in the Media Lab, to tackle the challenge. They’ve embarked on a series of technically sophisticated experiments that are letting Moore test his 13-year-old hypothesis at last.
