The Power of Thought

Miguel Nicolelis continues to lead the way in neural-implant technology.

In January a rhesus monkey named Idoya did what no other creature had done before: she made a robot walk just by thinking about it. All Idoya had to do was imagine taking a step, and the robot would actually take it.

A 2001 Technology Review article described experiments done by Miguel Nicolelis, in which a monkey used brain signals to remotely control a simple robotic arm (shown in front of Nicolelis).

Signals sent over the Internet from electrodes in Idoya's brain made the 200-pound robot walk on a treadmill in Kyoto, Japan, while the monkey sat on the other side of the world, in Miguel Nicolelis's lab at the Center for Neuroengineering at Duke University in Durham, NC. This telekinetic remote control was the latest achievement made possible by Nicolelis's research on a novel brain-machine interface, a technology singled out as one of the TR10 in 2001.

Nicolelis says his recent experiment shows that his neuroprosthetic system is close to fulfilling its promise of restoring mobility to paralyzed patients by means of an exoskeleton. This robotic support system would move limbs by the power of thought alone: a processor worn on the hip would translate brain signals into commands telling the exoskeleton to move however its wearer intended. In January, Nicolelis’s group launched a project to build the exoskeleton.

When Technology Review wrote about Nicolelis in 2001, his work was still in its infancy. The implanted electrodes could record the activity of just 90 neurons; while that allowed a monkey to control a robotic arm, the quality of control would quickly deteriorate. Yet the results, as described by senior associate editor Antonio Regalado, were a breakthrough:

Belle, a nocturnal owl monkey small enough to fit comfortably in a coat pocket, blinks her outsized eyes as a technician plugs four connectors into the sockets installed in the top of her skull. In the next room, measurements of the electrical signals from some 90 neurons in Belle’s brain pulse across the computer screen. Recorded from four separate areas of Belle’s cerebral cortex, the signals provide a window into what her brain is doing as she reaches to touch one of four assigned buttons to earn her reward–a few drops of apple juice. Miguel Nicolelis, a Duke University neurobiologist who is pioneering the use of neural implants to study the brain, points proudly to the captured data on the monitor and says: “This readout is one of a kind in the world.”

The same might be said of Nicolelis, who is a leader in a competitive and highly significant field. Only about a half-dozen teams around the world are pursuing the same goals: gaining a better understanding of how the mind works and then using that knowledge to build implant systems that would make brain control of computers and other machines possible. …

Nicolelis’s latest experiments … show that by tapping into multiple neurons in different parts of the brain, it is possible to glean enough information to get a general idea of what the brain is up to. In Belle’s case, it’s enough information to detect the monkey’s intention of making a specific movement a few tenths of a second before it actually happens. And it was Nicolelis’s team’s success at reliably measuring tens of neurons simultaneously over many months–previously a key technological barrier–that enabled the remarkable demonstration with the robot arm.
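The population-decoding principle described above can be illustrated with a toy sketch: a simple linear model (least-squares fit in NumPy) maps a vector of simultaneously recorded firing rates to an intended movement. The neuron count, noise level, and tuning model here are illustrative assumptions, not Nicolelis's actual recordings or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 90 neurons whose firing rates are noisily tuned to a
# 2-D movement velocity, loosely mimicking motor-cortex recordings.
n_neurons, n_samples = 90, 1000
tuning = rng.normal(size=(n_neurons, 2))        # each neuron's preferred direction
velocity = rng.normal(size=(n_samples, 2))      # intended (vx, vy) at each time step
rates = velocity @ tuning.T + 0.5 * rng.normal(size=(n_samples, n_neurons))

# Fit a linear decoder: firing rates -> intended velocity (least squares).
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode held-out activity and check how well the intent is recovered.
test_velocity = rng.normal(size=(200, 2))
test_rates = test_velocity @ tuning.T + 0.5 * rng.normal(size=(200, n_neurons))
decoded = test_rates @ decoder
corr = np.corrcoef(decoded[:, 0], test_velocity[:, 0])[0, 1]
print(f"correlation between decoded and true vx: {corr:.2f}")
```

Real brain-machine interfaces need adaptive decoders that cope with electrode drift and changing neural tuning, but the sketch captures the core idea: with enough neurons recorded at once, movement intent becomes reliably readable from population activity.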

Nicolelis’s recent experiment involved recording the activity of 500 neurons. To animate the proposed exoskeleton, he would like to send and receive information to and from up to 10,000 neurons–a difficult goal, but one he says can be reached. “The development of technology is not a straight line,” he says. “But we’re patient.”
