
Using neural activity recorded from a sheet of electrodes laid directly on the surface of a patient’s brain, scientists can predict the movement of fingers, as well as which of several sounds the patient is imagining. Eventually, researchers hope to use the findings to develop intuitive neural prostheses, such as a robotic hand that moves its fingers with as little mental effort as it takes to move real ones, or a computer interface that detects imagined words. To realize this vision, scientists are also developing smaller, more flexible technology, which could be more easily implanted and make better contact with the brain. Details of the latest brain-computer interface technology were presented this week at the Society for Neuroscience conference in Washington, DC.

“It could create the basis for a brain-computer interface that is very intuitive, and a recording platform that is very robust,” says Gerwin Schalk, a research scientist at the Wadsworth Center, in Albany, NY, who led one of the projects.

Schalk and his colleagues studied epilepsy patients undergoing a procedure known as electrocorticography (ECoG), in which a flat array of electrodes is laid over an exposed section of cortex to record electrical activity. Normally, surgeons use this information to pinpoint the source of seizures and to map the location of specific brain functions, which must be avoided during surgery. The technique provides better spatial resolution than electroencephalography (EEG), a noninvasive approach that records activity through the scalp. ECoG is now being explored for use in brain-computer interfaces. “There’s a growing interest in use of ECoG signals because nothing penetrates into the brain, and that appeals to people more than penetrating electrodes,” says Marc Schieber, a physician and scientist at the University of Rochester Medical School, who was not involved in the research.

Schalk and his collaborators recorded electrical activity from the motor cortex and Broca’s area, a part of the brain involved in speech, in five patients as they moved their hands and fingers in specific ways and vocalized or imagined specific sounds. The researchers then used specially developed algorithms to search the neural activity for patterns relating to a certain movement or sound. “We can tell you how they are flexing each of their fingers,” says Schalk. What’s more, the researchers could determine in real time which of two sounds a patient was imagining. This kind of information could be used to control a brain-computer interface, providing a lifeline for people with severe paralysis, such as that associated with end-stage amyotrophic lateral sclerosis, a neurodegenerative disease, or locked-in syndrome, the result of a specific kind of stroke that leaves the patient unable to move or communicate.
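The article does not spell out the decoding algorithms, but a common recipe in ECoG work of this kind is to turn each channel’s signal into frequency-band power features and feed them to a simple classifier. The sketch below only illustrates that general idea on synthetic data; the sampling rate, channel count, frequency band, and logistic-regression decoder are assumptions for the example, not the researchers’ actual pipeline.

```python
# Illustrative sketch of a generic two-class ECoG decoder (e.g., which of
# two sounds is being imagined). Synthetic data; parameters are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FS = 1000               # assumed sampling rate, Hz
BAND = (70, 110)        # assumed high-gamma band, often informative in ECoG

def band_power(trials, fs=FS, band=BAND):
    """Average power in a frequency band for each trial and channel."""
    freqs, psd = welch(trials, fs=fs, nperseg=256, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)      # shape: (trials, channels)

# Synthetic stand-in data: 100 trials, 32 ECoG channels, 1 second each.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((100, 32, FS))
y = rng.integers(0, 2, size=100)             # label: which sound was imagined

X = band_power(X_raw)                        # band-power features per channel
clf = LogisticRegression(max_iter=1000)      # simple linear decoder
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

In a working system, such a decoder would be trained on labeled trials recorded while the patient imagines each sound, then applied to incoming activity to drive the interface in real time, as the researchers demonstrated.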


Credit: Gerwin Schalk/Journal of Neural Engineering
Video by Justin Williams

Tagged: Computing, Biomedicine, electrodes, prosthesis, brain-machine interface, neural implant, ECoG

