The Dream controller (short for Dynamically-Resizing, Ergonomic and Multi-touch) is a new program designed for Microsoft’s Surface touch-screen table that lets one or more people take control of a single robot, or a swarm of them. The software offers new ways for users to manage search-and-rescue robots through a touch-screen interface: several virtual robot controllers are automatically integrated with a view of a virtual map.
When disaster strikes, search-and-rescue teams must quickly gather and assimilate the data needed to find survivors. A team of robots can help scout for people trapped in rubble or create new maps of the landscape. But first responders need ways to control those robots and to process incoming information quickly.
Robots are usually controlled with a physical device, like a joystick or a game-console-style controller. Mark Micire, a researcher at the University of Massachusetts Lowell and a member of the Massachusetts FEMA team, who deployed search-and-rescue robots at the World Trade Center after 9/11, built the Dream system to help first responders. Because they no longer need a separate device to control each of several robots while referring to a physical map, he says, responders can maneuver more quickly.
“Right now, the state of practice is to use paper maps, with everyone gathered around,” says Holly Yanco, professor and head of the Robotics Lab at UMass Lowell. “We’ve designed the multi-touch application to replace these maps with interactive ones.” This allows live data, such as satellite imagery, sensor readings, and video or photographs from people, vehicles, or robots, to be used nearly instantaneously.
“With our design, a person can select the robot, then place his or her hands down to form the Dream controller to directly drive the robot and see the robot’s eye video,” says Yanco. “Once done, the controller disappears.”
Each controller sizes itself to a person’s hand span and finger size, based on the contact points made with the screen, so very large or very small hands can control it just as easily. “We aren’t aware of any other multi-touch controllers that conform to the placement and size of a person’s hand,” says Yanco. The Dream controller’s hand and finger registration algorithm, which has a patent pending, is faster and more accurate than existing approaches, according to Yanco.
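The actual registration algorithm is patent-pending and not described in the article, but the basic idea of sizing a control from touch contact points can be sketched. The snippet below is a minimal, hypothetical illustration (the `fit_controller` function and its scaling factor are invented for this example, not part of the Dream system): it takes the fingertip coordinates of one hand and derives a center and radius for a controller scaled to that hand’s spread.

```python
import math

def fit_controller(touch_points):
    """Size and position a virtual controller from one hand's touch points.

    touch_points: list of (x, y) screen coordinates, one per fingertip.
    Returns ((cx, cy), radius): the controller is centered under the palm,
    with its size scaled to the hand's spread.
    """
    if len(touch_points) < 2:
        raise ValueError("need at least two contact points to size a controller")

    # The centroid of the contact points approximates the palm position.
    cx = sum(x for x, _ in touch_points) / len(touch_points)
    cy = sum(y for _, y in touch_points) / len(touch_points)

    # Hand spread: the farthest fingertip from the centroid sets the scale,
    # so large and small hands each get a comfortably sized controller.
    spread = max(math.hypot(x - cx, y - cy) for x, y in touch_points)

    # Pad the radius slightly so controls sit just beyond the fingertips.
    radius = spread * 1.2
    return (cx, cy), radius
```

With five touch points roughly tracing an arc of fingertips, `fit_controller` returns a controller centered under the palm whose radius grows or shrinks with the hand, which is the behavior the article attributes to the Dream controller.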
Here’s a video of the controller being used to control a real robot (an ATRV-Jr):
And this video shows it controlling a swarm of virtual robots: