MIT Technology Review

The Dream controller (short for Dynamically Resizing, Ergonomic, And Multi-touch) is a new program for Microsoft’s Surface touch screen that lets one or more people control a single robot, or a swarm of them. The software offers new ways to manage search-and-rescue robots through a touch-screen interface: several virtual robot controllers are overlaid automatically on a view of a virtual map.

When disaster strikes, search-and-rescue teams must quickly gather and assimilate the data needed to find survivors. A team of robots can help by scouting for people trapped in rubble or by mapping the landscape. But first responders need ways to control those robots and to process incoming information quickly.

Robots are usually controlled with a physical device, such as a joystick or a game-console-style controller. Mark Micire, a researcher at the University of Massachusetts Lowell and a member of the Massachusetts FEMA team, who deployed search-and-rescue robots at the World Trade Center after 9/11, built the Dream system to help first responders. Because operators no longer need a separate device for each of several robots, or to refer back to a physical map, he says, they can maneuver more quickly.

“Right now, the state of practice is to use paper maps, with everyone gathered around,” says Holly Yanco, professor and head of the Robotics Lab at UMass Lowell. “We’ve designed the multi-touch application to replace these maps with interactive ones.” This lets live data–satellite imagery, sensor readings, and video or photography from people, vehicles, or robots–be used nearly instantaneously.

“With our design, a person can select the robot, then place his or her hands down to form the Dream controller to directly drive the robot and see the robot’s eye video,” says Yanco. “Once done, the controller disappears.”
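The appear-and-disappear behavior Yanco describes can be thought of as a small event-driven lifecycle: a hand placed on the screen conjures a controller beneath it, and lifting the hand dismisses it. A minimal sketch, using hypothetical `hand_down`/`hand_up` event names (the real Surface touch API differs):

```python
class DreamSession:
    """Illustrative sketch of the summon/dismiss lifecycle, not the
    actual Dream implementation."""

    def __init__(self):
        self.controllers = {}  # hand_id -> controller state

    def hand_down(self, hand_id, touches):
        # A hand placed on the screen spawns a controller under it,
        # bound to whichever robot the user last selected.
        self.controllers[hand_id] = {"touches": touches, "active": True}

    def hand_up(self, hand_id):
        # Lifting the hand makes its controller disappear.
        self.controllers.pop(hand_id, None)
```

Because each controller is keyed by the hand that created it, several people can hold independent controllers on the same screen at once, matching the multi-user scenario the article describes.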

Each controller sizes itself to a person’s hand span and finger size, based on the contact points made with the screen, so very large or very small hands can operate it equally easily. “We aren’t aware of any other multi-touch controllers that conform to the placement and size of a person’s hand,” says Yanco. The Dream controller’s hand- and finger-registration algorithm, which has a patent pending, is faster than and outperforms other approaches, according to Yanco.
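One simple way to picture this dynamic sizing is to center the controller on the centroid of a hand’s contact points and scale it to the widest finger spread. This is only an illustrative sketch under that assumption, not the patent-pending registration algorithm:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Controller:
    center: tuple  # (x, y) position under the palm
    radius: float  # scaled to the user's finger spread

def fit_controller(touches):
    """Size and place a virtual controller from one hand's contacts.

    `touches` is a list of (x, y) fingertip positions. Centering on
    the centroid and scaling to the farthest fingertip means large
    and small hands each get a comfortably sized controller.
    """
    if len(touches) < 2:
        raise ValueError("need at least two contact points")
    cx = sum(x for x, _ in touches) / len(touches)
    cy = sum(y for _, y in touches) / len(touches)
    # Radius follows the farthest fingertip from the centroid.
    radius = max(hypot(x - cx, y - cy) for x, y in touches)
    return Controller(center=(cx, cy), radius=radius)
```

The real algorithm must also distinguish which finger is which and reject stray contacts, which is where the hard registration problem lies; this sketch covers only the sizing idea.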

Here’s a video of the controller driving a real robot (an ATRV-Jr):

And this video shows it controlling a swarm of virtual robots:


Tagged: Computing, robotics, robots, touch screen, Microsoft Surface, search and rescue

