Yesterday at EmTech’s “From the Labs: Cool Innovations” session, Holly Yanco, a professor of computer science at the University of Massachusetts Lowell, discussed her robotic wheelchair project. She first demonstrated the difficulty of using a standard robotic arm attachment for wheelchairs by showing a screenshot of complicated joystick instructions, which, she pointed out, many people don’t want to have to learn just to command a robot to reach for an object. Instead, she is combining camera vision with touch-screen technology: a camera takes a shot of the objects in front of a shelf, for example, and displays them on a touch screen. The user simply touches the object she wants on the screen, and Yanco’s software directs the robot to reach for it. This intuitive approach, she says, will make robotic assistants more useful to people. “My students are very inspired by video games,” says Yanco. Just as in video games, a more intuitive alternative to the joystick tends to be more successful and to make the experience more enjoyable for the user.
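The touch-to-select flow she describes (detect objects in the camera frame, display them on a touch screen, and let a tap pick the reach target) can be sketched roughly as follows. All class and function names here are illustrative assumptions, not Yanco's actual software:

```python
# Minimal sketch of the touch-to-select idea, under assumed names.
# A vision system would populate DetectedObject instances; here they
# are hard-coded to show how a tap maps to a reach target.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str
    x: int       # bounding box top-left, in screen pixels
    y: int
    width: int
    height: int

def object_at_touch(objects, touch_x, touch_y):
    """Return the detected object whose bounding box contains the touch point,
    or None if the user tapped empty space."""
    for obj in objects:
        if (obj.x <= touch_x < obj.x + obj.width
                and obj.y <= touch_y < obj.y + obj.height):
            return obj
    return None

# Example: two camera detections for objects on a shelf.
shelf = [DetectedObject("mug", 40, 60, 80, 100),
         DetectedObject("book", 160, 50, 90, 140)]

# A tap at (75, 110) lands inside the mug's box, so the mug becomes
# the target the robot arm would reach for.
selected = object_at_touch(shelf, 75, 110)
```

The point of the design, as the article notes, is that the user never issues joystick commands: selecting the target is a single tap, and the motion planning happens behind the scenes.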