Brain implants, AI, and a speech synthesizer have turned brain activity into robot words
Neural networks have been used to turn words that a human has heard into intelligible, recognizable speech. It could be a step toward technology that can one day decode people’s thoughts.
A challenge: Thanks to fMRI scanning, we’ve known for decades that when people speak, or listen to others speaking, specific parts of their brain activate. Translating that activity into words, however, has proved hugely challenging. A team from Columbia University has developed a system that combines deep learning with a speech synthesizer to do just that.
The study: The team temporarily placed electrodes in the brains of five people scheduled to undergo brain surgery for epilepsy. (People who have this procedure often have implants fitted to learn more about their seizures.) The volunteers listened to recordings of sentences while their brain activity was used to train deep-learning-based speech recognition software. They then listened to 40 spoken numbers, and the AI tried to decode what they had heard on the basis of their brain activity alone, speaking the results out loud in a robotic voice. Volunteers who listened to the synthesized output recognized the right word 75% of the time. The results were published in Scientific Reports today (and you can listen to the recordings here).
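The core decoding step in the study — mapping recorded brain activity to the word a listener heard — can be illustrated with a toy classifier. Everything below is simulated and hypothetical: random vectors stand in for electrode features, and a simple softmax decoder stands in for the team's deep network and speech synthesizer. It only sketches the idea of training a decoder on labeled neural recordings of spoken digits.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the study's actual pipeline): each trial is a
# vector of features from implanted electrodes, labeled by which spoken
# digit (0-9) the listener heard. All data here are simulated.
n_digits, trials_per_digit, n_features = 10, 40, 64

# Each digit evokes a distinct mean neural pattern plus trial-to-trial noise.
patterns = rng.normal(size=(n_digits, n_features))
X = np.repeat(patterns, trials_per_digit, axis=0)
X += 0.5 * rng.normal(size=X.shape)
y = np.repeat(np.arange(n_digits), trials_per_digit)

# Train a softmax (multinomial logistic) decoder with gradient descent.
W = np.zeros((n_features, n_digits))
for _ in range(200):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.01 * X.T @ (p - np.eye(n_digits)[y]) / len(y)

# Fraction of trials where the decoder picks the digit actually heard.
accuracy = (np.argmax(X @ W, axis=1) == y).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

Note that the actual study went further than classification: per the article, the decoded output was fed to a speech synthesizer so the result was audible, robotic speech rather than a text label.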
The aim? At the moment the technology can reproduce only words that these five patients have heard—and it wouldn't work on anyone else. But the hope is that technology like this could one day let people who have lost the ability to speak through paralysis communicate with their family and friends.
MIT Technology Review