
Replacing Lost Abilities with a Robot

A robot recently helped a quadriplegic shave himself for the first time in 10 years—but even the best mechanical helpers still need supervision.
July 19, 2011

Henry Evans recently shaved himself for the first time since a stroke left him mute and partly paralyzed 10 years ago. His achievement came thanks to researchers in robotics, not medicine, and it demonstrates the huge potential that robots have for assisting people with disabilities.

Grooming machine: This robot holding an electric shaver is giving a quadriplegic man new abilities.

Yet it also shows how much work still needs to be done to enable robots to work closely with humans. Each time Evans uses the robot, he must be accompanied by engineers ready to intervene if something goes wrong.

The techniques being developed to address this challenge could also prove useful in factories, where they could enable humans and robots to work together more closely on complex manufacturing tasks.

Evans has been using a two-armed robot on wheels known as a PR2, which was created by the private research lab Willow Garage.

Evans operates the robot by moving an on-screen cursor with head movements and by clicking a button with one finger. Engineers at Willow Garage and the Healthcare Robotics Lab at Georgia Tech built a special user interface that runs on Evans’s computer, letting him control the robot and see views from the cameras on its head and arms.

Evans can take direct control and steer the robot’s wheeled base and arms. He can also click on the video feeds to tell the robot where to position one of its grippers, or where to grasp an object.

Evans can, for example, scratch his face by clicking on where his head appears in the video feed. That moves the robot’s gripper close enough for him to rub against it. He was able to shave himself in a similar manner after an electric shaver was attached to the robot’s gripper. Evans can also use the robot to move objects around and put them into drawers in another room.
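
Willow Garage and Georgia Tech have not published the interface code described here, but the basic "click to reach" idea can be sketched in a few lines. The Python snippet below is a rough illustration rather than the team’s software: it assumes a depth camera registered to the video feed, standard pinhole camera intrinsics (fx, fy, cx, cy), and a placeholder send_gripper_goal() standing in for the PR2’s real motion commands.

    import numpy as np

    def click_to_target(u, v, depth_image, fx, fy, cx, cy):
        """Project a clicked pixel (u, v) and its measured depth into 3-D camera coordinates."""
        z = float(depth_image[v, u])              # depth in metres at the clicked pixel
        if not np.isfinite(z) or z <= 0.0:
            raise ValueError("no valid depth reading at the clicked pixel")
        x = (u - cx) * z / fx                     # standard pinhole back-projection
        y = (v - cy) * z / fy
        return np.array([x, y, z])

    def send_gripper_goal(target_xyz, standoff=0.05):
        """Hypothetical stand-in for the real motion command: stop a few
        centimetres short of the clicked surface, so the user can bring his
        face to the tool rather than the other way around."""
        approach = target_xyz - np.array([0.0, 0.0, standoff])
        print("commanding gripper to", approach)

    # Example: the user clicks pixel (320, 240) on the head-camera feed.
    # depth = np.load("depth_frame.npy")          # depth image aligned with the video
    # goal = click_to_target(320, 240, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
    # send_gripper_goal(goal)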

“Anytime Henry is left alone, he is unable to do a single thing for himself,” says Steve Cousins, CEO of Willow Garage. “We’re showing how robots could give back independence to people in that situation.” Cousins hopes to recruit more people who could benefit from robotic assistance to join the research project.

But despite getting a lot out of using the robot, Evans cannot yet be left alone with PR2. “The first time that he wanted to scratch his nose, we were scared,” admits Cousins. Engineers need to be on hand at all times, he says, because of the robot’s rudimentary awareness and Evans’s vulnerability if something went wrong. “There’s a lot more work to do there to get it to a point where he could do things like scratch or shave on his own,” he says.

Making the robot safe enough to be left alone with Evans will require it to respond to commands more intelligently, and to cope with unexpected problems—such as a person getting in its way.

Some of these refinements are already in development. Willow Garage engineers and a researcher from Rensselaer Polytechnic Institute recently developed software that enables the robot to figure out for itself how best to grasp an object (see video).
                                     
Despite the challenges, Willow Garage’s state-of-the-art hardware shows the potential of robots as helpers for disabled people, says Rajiv Dubey, a professor in the Rehabilitation Robotics group at the University of South Florida. His research group is working on a robotic arm that attaches to a wheelchair and has experimented with allowing completely paralyzed people to operate robots using brain-computer interfaces.

Fortunately, robots don’t need to have human-level intelligence to help people like Evans. “You don’t need everything to be autonomous,” says Dubey. “You have a human in the loop, and you can combine that person’s cognitive abilities with the robot’s computational ones.”

In the future, this could mean that the person in charge of a robot will help it by indicating exactly where it should stand, or how to grip something. Other times, the robot will help—for instance, by preventing liquid from being spilled when the person takes direct control of the robot’s arm in order to sip from a cup.
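
Dubey’s point about keeping a human in the loop can be made concrete with a small, hypothetical example. The sketch below is not code from either lab: the person steers the arm directly, while the software quietly clamps any commanded wrist tilt beyond an assumed spill threshold, so a held cup stays roughly level.

    import numpy as np

    MAX_TILT_RAD = np.radians(15.0)   # assumed spill threshold for a partly full cup

    def constrain_wrist(commanded_roll, commanded_pitch):
        """Pass the user's command through, but clamp roll and pitch to the tilt limit."""
        safe_roll = float(np.clip(commanded_roll, -MAX_TILT_RAD, MAX_TILT_RAD))
        safe_pitch = float(np.clip(commanded_pitch, -MAX_TILT_RAD, MAX_TILT_RAD))
        return safe_roll, safe_pitch

    # The user tips the control hard while bringing the cup in for a sip...
    roll, pitch = constrain_wrist(np.radians(40.0), np.radians(-5.0))
    # ...but the wrist only tilts to the 15-degree limit, not the commanded 40.
    print(np.degrees(roll), np.degrees(pitch))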
