
Biomedicine

Reading Baby Brains

New software and hardware opens a window into babies’ developing brains.

A maelstrom of neural connections develops in a child’s brain during the first five years of life. Understanding how these interconnected circuits form, and how babies think, could lead to a host of new insights into everything from autism to language acquisition. But gathering such information has been tricky: infants can’t be ordered to stay motionless, which is required for most advanced neuroimaging techniques. Now a system that works in concert with existing imaging machinery can account for head movement and, for the first time, let researchers see detailed activity in an active baby’s brain.

Deciphering young minds: Magnetoencephalography (MEG) scans (bottom) provide precise information on where neuron clusters are firing in the brain. A new setup at the University of Washington, which adds a cap to monitor head position (top), lets researchers use the technology on infants and young children, such as the six-month-old pictured here.

Magnetoencephalography (MEG), a technology used to study brain function and to pinpoint diseased areas of the brain, capitalizes on the very weak magnetic fields created whenever a cluster of neurons fires at once. A helmet with 306 sensors, resembling a salon hair dryer, hovers over the subject’s head and detects where the magnetic pulses are occurring. Unlike magnetic resonance imaging (MRI) machines–which show only snapshots of data and require people to lie still inside a noisy, narrow tunnel while subjected to a powerful magnetic field–the MEG is pin-drop quiet and open, allowing subjects to interact with their surroundings. The resulting data can show researchers precisely where activity is occurring in the brain in real time.

Use of the technology in infants and young children has been limited because they typically need to be sedated to stay motionless long enough for traditional MEG machines to collect the necessary data. “The enemy for any kind of imaging, especially brain imaging, is movement,” says Sylvain Baillet, director of the MEG program at the Medical College of Wisconsin in Milwaukee. “It’s a bit like trying to take a picture of a child who’s constantly moving with a camera with a very slow aperture–the picture will be fuzzy.”

In order to study babies who are wide awake and socially engaged, researchers at the University of Washington’s Institute for Learning & Brain Sciences (I-LABS) worked with the Helsinki-based medical device company Elekta to create a “head-positioning” system remarkably similar to GPS. Scientists strap a soft nylon cap to the baby’s head. The cap has four embedded coils, each of which emits a high-frequency signal indicating its relative position at all times. As the hardware tracks the skull’s movement, the software interprets the results and merges them with the MEG-sensor data.
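The article does not describe Elekta’s proprietary algorithm, but the basic idea of tracking a few fixed markers and compensating for rigid head movement can be sketched. The following is a minimal, illustrative example (not the actual system): given the four coil positions at a reference moment and at a later moment, it recovers the head’s rotation and translation with the standard Kabsch algorithm, so that source locations measured in the fixed sensor frame can be mapped back into the moving head frame. All function names here are hypothetical.

```python
import numpy as np

def estimate_head_movement(coils_ref, coils_now):
    """Estimate the rigid transform (rotation R, translation t) that carries
    the reference coil positions onto the currently measured positions,
    using the Kabsch algorithm. Inputs are (4, 3) arrays of coil coordinates.
    Assumes the coils move rigidly with the skull (they are fixed in the cap)."""
    c_ref = coils_ref.mean(axis=0)          # centroid of reference positions
    c_now = coils_now.mean(axis=0)          # centroid of current positions
    H = (coils_ref - c_ref).T @ (coils_now - c_now)   # 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_now - R @ c_ref
    return R, t

def to_head_frame(points, R, t):
    """Map points (rows, in the fixed sensor frame) back into the moving
    head frame: the inverse of x -> R @ x + t, so that source estimates
    stay anatomically aligned as the baby moves."""
    return (points - t) @ R
```

A real system would run this continuously, interpolating the transform across the MEG recording; the sketch shows only the per-snapshot geometry.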

“For the first time, we can put babies and young children in this device while they’re engaged in a cognitive test,” says I-LABS codirector Patricia Kuhl. “Then, because you’re looking at the whole brain, you have the chance to look at interactions between various areas.” Some things stimulate neurons to fire in just a single region, while others trigger more complex neural responses across multiple locations of the brain–in children, this distinction is particularly important for understanding processes like language acquisition and potentially for diagnosing autism and other conditions.

So far, Kuhl and her colleagues have seen differences in brain activation between infants presented with language spoken by a person in the room with them and infants watching the same person, speaking the same script, on a television. “Once we know what the difference is between live and television exposure, we would like to take the same measures from children with autism,” Kuhl says. “They may be more engaged by a television set.”

Using a less exact technology called electroencephalography, or EEG, her group has previously found that typical children respond to a classic mothering voice, while children with autism are far more interested when the same tones are produced by a computer. However, EEG has poor spatial resolution, making it difficult to determine where in the brain these differences arise. Now these kinds of studies can be performed in greater detail with MEG to better understand the brain areas involved. “Brain measures are going to be extremely potent biomarkers of autism,” Kuhl says. Early diagnosis, before the first visible symptoms appear, may lead to early interventions.

“No one has done [MEGs on young children] in a systematic and rigorous way before,” says Steven Stufflebeam, the director of clinical magnetoencephalography at the Martinos Center for Biomedical Imaging at Massachusetts General Hospital. “If they do it, they may discover something brand new that could revolutionize the neuroscience of kids. But it’s a little hard now to predict what they’re going to find.”
