Yesterday at EmTech’s “From the Labs: Cool Innovations” session, Holly Yanco, a professor of computer science at the University of Massachusetts Lowell, discussed her robotic wheelchair project. She first demonstrated the difficulty of using a standard robotic arm attachment for wheelchairs by showing a screenshot of complicated joystick instructions, which, she pointed out, many people don’t want to have to learn just to command a robot to reach for an object. Instead, she is combining camera vision with touch-screen technology: a camera takes a shot of objects in front of a shelf, for example, and displays them on a touch screen. The user simply touches the object she wants on the screen, and Yanco’s software directs the robot to reach for it. This intuitive approach, she says, will make robotic assistants more useful for people. “My students are very inspired by video games,” says Yanco. Just as in video games, a more intuitive interface tends to be more successful and to make the experience more enjoyable for the user.