Robotics company Willow Garage has started a two-year project to work with institutions from around the world on new applications for its robot: the PR2. Each of 11 teams will work on its own projects, but all will share their code with each other and the rest of the world. Everything created will be open-source, meaning others can use the code for their own endeavors. (The PR2 runs on a software platform called ROS, the Robot Operating System, also developed by Willow Garage.)
Reminiscent of Johnny 5 from the movie “Short Circuit”, the PR2 has two compliant arms that are strong yet capable of delicate tasks: the PR2 can turn the pages of a book, for example. The arms gather data about the forces applied to them, helping the robot respond accordingly. Stereo cameras, laser scanning range finders, inertial measurement sensors and an array of other tools provide the data about the robot’s environment needed to complete a wide range of tasks, including navigating a room and opening a door with a spring-loaded handle.
Each team hopes to expand the system’s skills. The team from Stanford University (where the technology behind the robot was born) is working on software for cleaning up a table and taking inventory. Folks at MIT’s CSAIL lab, meanwhile, will work on object recognition and putting away groceries. Bosch will develop skins for the robots to allow them to feel their environment. Using an earlier version of the robot, Pieter Abbeel’s lab at the University of California, Berkeley developed software for neatly folding towels. (Look out Gap employees! T-shirts could be next!)
“We want to get robots out of factories and into the real world,” said Willow Garage CEO Steve Cousins at a press conference yesterday in the company’s Menlo Park, CA offices.
I attended the event via another of the company’s creations: the Texai. It’s a bit like video conferencing while driving a remote-controlled car via the Internet. The robot consists primarily of a flat screen monitor, with audio and video recording equipment. Folks who looked at my screen saw my face as I sat in my living room in New Jersey. Using Skype, I was able to see and hear most of the press conference with ease. I got a good spot in the front row, and drove up to a few folks afterwards to ask follow-up questions. It was, however, a bit hard to hear some people while mingling in the noisy room after the event. But as long as the person was facing me directly, I could hear them just fine.
The only other oddity: because of the position of the camera on the Texai, it often seemed as though people were staring at my chest instead of looking me square in the eyes. But I suppose that happens a fair bit in real life, too.