Learning to hear
Only a few types of sound reach babies in the womb, but that may help them learn to process auditory input as they grow.

Human fetuses can begin to hear at around 20 weeks of gestation, but only low-frequency sounds penetrate the muffled environment of the womb. A new study suggests that this is a feature, not a bug.
Using simple computer models of human auditory processing, Pawan Sinha, SM ’92, PhD ’95, a professor of vision and computational neuroscience, and colleagues showed that models performed better on tasks such as identifying emotions from a voice clip when the input they received early in training was limited to these low frequencies.
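The study's exact models aren't described here, but the core preprocessing idea, restricting early training input to low frequencies the way the womb does, can be sketched with a simple brick-wall low-pass filter. Everything below (the `low_pass` function, the 500 Hz cutoff, and the synthetic "voice" signal) is an illustrative assumption, not the researchers' actual pipeline:

```python
import numpy as np

def low_pass(signal, sample_rate, cutoff_hz):
    """Zero out spectral components above cutoff_hz (FFT brick-wall filter).

    Illustrative only: a real pipeline would likely use a smoother filter.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic "voice": a 200 Hz fundamental plus a 3 kHz harmonic.
sr = 16000
t = np.arange(sr) / sr
voice = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)

# Womb-like version: keep only components below a hypothetical 500 Hz cutoff.
muffled = low_pass(voice, sr, cutoff_hz=500)

# A curriculum in the spirit of the study would train a model first on
# `muffled` inputs, then graduate to the full-spectrum `voice` inputs.
```

The low-frequency components (here, the 200 Hz fundamental) pass through unchanged, while everything above the cutoff is removed, mimicking the muffled signal a fetus hears.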
Along with a previous study by the same team, which showed that early exposure to blurry images of faces improves computer models’ subsequent performance in face recognition, the findings suggest that initially receiving low-quality sensory input may be key to some aspects of brain development, particularly the ability to integrate information over larger spatial areas or longer spans of time.
“Instead of thinking of the poor quality of the input as a limitation that biology is imposing on us, this work takes the standpoint that perhaps nature is being clever and giving us the right kind of impetus to develop the mechanisms that later prove to be very beneficial when we are asked to deal with challenging recognition tasks,” Sinha says.
In practical terms, the new findings suggest that babies born prematurely may benefit from being exposed to lower-frequency sounds rather than the full spectrum that they now hear in neonatal intensive care units, the researchers say.