Ng’s team developed an alternative that simplifies the process. Instead of collecting data about many points on an object, the researchers’ algorithm identifies the midpoint of a graspable portion, such as a handle, by computing the object’s edges and comparing them with the edges of statistically similar objects in its database. The software then locates this point in the images from both cameras and triangulates its distance. “This was the key idea that made all of our grasping things work,” Ng says. “We’ve now done things like load items from a dishwasher.”
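The triangulation step described above can be sketched in a few lines. This is not the Stanford code, just a minimal illustration of the general idea: once the same grasp point has been matched in two camera images, its 3-D position follows from the disparity between the two views. The function name, the rectified-stereo pinhole model, and the example numbers are all assumptions for illustration.

```python
def triangulate(x_left, x_right, y, focal_px, baseline_m):
    """Recover the 3-D position (in metres, relative to the left camera)
    of a point seen at pixel column x_left in the left image and x_right
    in the right image, on the same row y after rectification.

    Assumes a simple rectified stereo pair: pinhole cameras with focal
    length focal_px (pixels), separated horizontally by baseline_m, with
    pixel coordinates measured from each image's principal point.
    """
    disparity = x_left - x_right           # pixels; larger means closer
    if disparity <= 0:
        raise ValueError("point must lie in front of both cameras")
    z = focal_px * baseline_m / disparity  # depth along the optical axis
    x = x_left * z / focal_px              # lateral offset
    y_m = y * z / focal_px                 # vertical offset
    return x, y_m, z

# Hypothetical example: a handle's midpoint detected at column 420 in the
# left image and 400 in the right, row 50, with a 700-pixel focal length
# and a 10 cm baseline between the cameras.
point = triangulate(420, 400, 50, focal_px=700, baseline_m=0.10)
```

The key property, and the reason the two cameras matter, is that nearby objects show a large disparity between the views while distant ones show almost none, so a single matched point is enough to place the grasp target in space.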
Robots still need to learn the finer points of automatic manipulation, Ng adds. STAIR was designed only to grasp objects, not to adjust its grasp to the situation. For instance, it wasn’t built to pour coffee from a pot, a task that might require a different grasp position and a different amount of pressure than simply picking up the pot and placing it on a shelf. Additionally, the software doesn’t know the consistency of the object, whether it’s squishy or solid. But researchers are working on these problems, and ultimately, a personal robot will have a combination of sensing technologies and different software that will allow it to pick up and manipulate an object. (See “Robots That Sense Before They Touch.”)
It could be years before all the technologies are integrated well enough so that robots can handle complex household chores on their own, but the Stanford work is pushing the dream forward. “If I had to pick one thing that’s holding back this vision of personal robotics, it would be the ability to pick things up and manipulate them,” says Josh Smith, senior research scientist at Intel Research, in Seattle. “We need more grasping strategies, like [the Stanford researchers’], that don’t require an explicit 3-D model of the object.” He adds that in addition to the robot having improved computer vision techniques, the actual hand of the robot will most likely have a number of sensors that can feel if an object is moving or if the grasp isn’t right. “Much richer sensing in the hand will be an important part of the solution,” Smith says.