An inexpensive new prototype device called the Obake adds a new dimension to touch screen technology. The surface of the device, developed by Dhairya Dand and Rob Hemsley of the MIT Media Lab, can react to how it’s being used by reaching out toward the user. It was relatively simple to make: the researchers used an open source software framework to enable the screen to react; the hardware costs between $50 and $60, Dand says.
Six specialized motors located below a screen of silicone liquid rubber control the screen’s movement. Push, pry, prod, pinch or poke it, and the surface is malleable enough to yield and move.
A small microphone below the screen picks up the noise of vibrations when a finger touches the surface. Cameras mounted above detect the movement of a user’s hand; an overhead projector is used to display images.
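The researchers haven’t published their detection code, but the idea of registering a touch from the microphone can be sketched in a few lines. The sketch below is purely illustrative, not the Obake implementation: it assumes the raw audio arrives as a list of normalized samples and flags a tap wherever a short window’s RMS amplitude crosses a threshold.

```python
# Illustrative only: a minimal amplitude-threshold tap detector,
# assuming normalized audio samples in the range [-1.0, 1.0].
import math

def detect_taps(samples, threshold=0.1, window=4):
    """Return start indices of windows whose RMS amplitude exceeds threshold."""
    hits = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        rms = math.sqrt(sum(s * s for s in chunk) / window)
        if rms > threshold:  # loud enough to count as a finger strike
            hits.append(start)
    return hits

quiet = [0.01, -0.02, 0.015, -0.01]          # background noise
tap = quiet + [0.6, -0.5, 0.55, -0.4] + quiet  # a sharp vibration in the middle
print(detect_taps(tap))  # → [4]
```

A real system would also need to locate the touch, which is where the overhead cameras come in; the microphone alone only says that a contact happened.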
“There are many ways to detect,” says Dand. A matrix of bend sensors embedded in the material can achieve the same effect as the depth cameras. “And you don’t need the projector on top. You can have it from the side or from the bottom,” he says.
Dand envisions many potential applications. The “display opens up all sorts of new options,” he adds. “Like for Excel tables with rows and columns. What if it could pop out a bar chart and change the data?”
The name Obake comes from a shape-shifting mythological Japanese creature. Dand and Hemsley say they were heavily influenced by nature in their design, specifically in the malleability and feel of water. “Things that you can actually feel and touch, they have their own inherent beauty. That’s how nature works,” Dand says.
But both recognize that people are accustomed to what they have now. For today’s touchscreens, the world is still flat. “We need a paradigm shift in user interface,” says Dand.