Service dogs that open doors, switch on lights, and perform other useful tasks offer a much-needed lifeline to people with disabilities. Now researchers at the Georgia Institute of Technology are developing robots that mimic the relationship between humans and their canine helpers.
Robotics researchers have long sought to create robots that can help out around the home. But while robots are good at carrying out preprogrammed tasks and following a clear trajectory, navigating a complex home environment and interacting with real people remains a formidable challenge.
Charles Kemp, a professor at Georgia Tech, believes that animal helpers may offer the ideal model for robotic assistants. He began by studying the way that helper monkeys (capuchins trained to perform useful tasks for disabled people) fetch an object or operate a device when it is highlighted with a laser pointer. “That got us excited about what we can learn from state-of-the-art biological systems,” says Kemp. It also inspired him and his colleagues to develop El-E, a robot that, earlier this year, they trained to respond to commands given via a laser pen.
More recently, Kemp and his student Hai Nguyen recognized the potential of canine helpers after seeing a demonstration given by a charity called Georgia Canines for Independence. The charity's dogs are trained to open doors, drawers, and cupboards and to fetch objects or operate lights when given a command. “We were amazed at what the dog could do [and] found out there’s a list of commands service dogs obey,” Kemp says. “That seemed like a great model to go by. If we could make a robot that obeys all those commands, we knew that we would have something valuable.”
The latest version of El-E has been upgraded so that, in addition to responding to a laser pointer, it understands voice commands and can perform a wider range of tasks. The robot can be commanded the same way as a service dog: given the right vocal command, it grabs hold of a towel attached to a door, drawer, or cupboard. As with service dogs, towels help the robot with both perception and physical interaction. “[El-E] doesn’t know anything about the specific drawer or doors: it’s able to generalize with these commands,” says Kemp. “A towel is actually easy to grasp because you can be at many locations on it and still get a good grip.”