
Kinect Turns Any Surface Into a Touch Screen

Researchers combine a Kinect sensor with a pico projector to expand the possibilities for interactive screens.

A new prototype can transform a paper notebook into a notebook computer, a wall into an interactive display, and the palm of your hand into a smart phone display. In fact, researchers at Microsoft and Carnegie Mellon University say their new shoulder-mounted device, called OmniTouch, can turn any nearby surface into an ad hoc interactive touch screen.

Hands-on “screen”: A proof-of-concept system allows smart phones to use virtually any surface as a touch-based interactive display.

OmniTouch combines a miniature projector with an infrared depth camera, similar to the one in Microsoft’s Kinect sensor for the Xbox 360 game console, in a shoulder-worn system designed to interface with mobile devices such as smart phones, says co-inventor Chris Harrison, a postgraduate researcher at Carnegie Mellon’s Human-Computer Interaction Institute in Pittsburgh and a former intern at Microsoft Research. Instead of relying on screens, buttons, or keys, the system monitors the user’s environment for available surfaces and projects an interactive display onto one or more of them.

OmniTouch does this automatically, using the depth information provided by the camera to build a 3-D model of the environment, says Harrison. The camera acquires depth information by emitting a patterned beam of infrared light and using the reflections to calculate where surfaces are in the room, which eliminates the need for external calibration markers. The system rebuilds the model dynamically as the user or the surface moves—tracking, for example, the position of a hand or the angle and orientation of a book—so the size, shape, and position of the projections match those of the improvised display surfaces, he says. OmniTouch “figures out what’s in front of you and fits everything onto it.”
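To make the idea concrete, here is a minimal, illustrative sketch in Python of the kind of depth-based surface finding described above. It assumes the depth frame arrives as a NumPy array in meters and simply flags locally flat, camera-facing pixels as candidate projection areas; the threshold is invented for the example, and the real system also handles tilted and moving surfaces, so this is a stand-in rather than the researchers’ pipeline.

```python
# Illustrative sketch only (not the researchers' code): flag flat, roughly
# camera-facing patches of a depth frame as candidate projection surfaces.
import numpy as np

def flat_mask(depth_m, max_slope=0.002):
    """Mark pixels where depth changes slowly (meters per pixel), i.e.
    locally flat regions facing the camera. max_slope is a made-up threshold."""
    dz_dy, dz_dx = np.gradient(depth_m)      # per-pixel depth gradients
    return np.hypot(dz_dx, dz_dy) < max_slope

if __name__ == "__main__":
    # Fake 240 x 320 depth frame: a wall 1.5 m away with a tilted book against it.
    depth = np.full((240, 320), 1.5)
    depth[80:200, 100:220] -= np.linspace(0.0, 0.3, 120)[None, :]
    mask = flat_mask(depth)
    print("flat pixels suitable for projection:", int(mask.sum()))
```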

The system also monitors the environment for anything cylindrical and roughly finger-sized, again using depth information to determine when a finger or fingers make contact with a surface. This lets users interact with arbitrary surfaces just as they would with a touch screen, says Harrison: objects and icons on the ad hoc “screens” can be swiped and pinched to scroll and zoom. In one demonstration art application, for example, OmniTouch used a nearby wall or table as a canvas and the palm of the user’s hand as the color palette.
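The touch decision itself can be pictured as a small depth comparison: once a fingertip has been located, check whether its depth sits within a few millimeters of the surface immediately around it. The sketch below assumes the fingertip pixel is already known and uses a made-up 10-millimeter contact threshold; it illustrates the idea rather than the actual OmniTouch implementation.

```python
# Illustrative sketch only: deciding whether a detected fingertip is "touching"
# a surface by comparing its depth with the surface just around it.
import numpy as np

def is_touching(depth_m, tip_row, tip_col, ring=6, contact_mm=10.0):
    """Return True if the fingertip pixel sits within contact_mm of the
    background surface sampled in a small window around it (depth in meters).
    ring and contact_mm are assumed values, not from the OmniTouch paper."""
    r0, r1 = max(tip_row - ring, 0), tip_row + ring + 1
    c0, c1 = max(tip_col - ring, 0), tip_col + ring + 1
    surface = np.median(depth_m[r0:r1, c0:c1])   # rough surface depth estimate
    gap_mm = (surface - depth_m[tip_row, tip_col]) * 1000.0
    return 0.0 <= gap_mm <= contact_mm

if __name__ == "__main__":
    depth = np.full((240, 320), 1.2)     # flat surface 1.2 m away
    depth[120, 160] = 1.195              # fingertip 5 mm above it: a touch
    print(is_touching(depth, 120, 160))  # True
    depth[120, 160] = 1.15               # 50 mm above: hovering, not a touch
    print(is_touching(depth, 120, 160))  # False
```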

The shoulder-mounted setup is completely impractical, admits Hrvoje Benko, a researcher in the Natural Interaction Research group at Microsoft Research in Redmond, Washington, who also worked on the project, along with colleague Andrew Wilson. “But it’s not where you mount it that counts,” he says. “The core motivation was to push this idea of turning any available surface into an interactive surface.” All the components used in OmniTouch are off the shelf and shrinking all the time. “So I don’t think we’re so far from it being made into a pendant or attached to glasses,” says Benko.

Duncan Brumby, a researcher at the University College London Interaction Centre, in England, calls OmniTouch a fun and novel form of interaction. The screen sizes of mobile devices can be limiting, he says. “There’s a growing interest in this area of having ubiquitous, intangible displays embedded in the environment,” he says. And although new generations of smart phones tend to have increasingly high-quality displays, Brumby reckons users would be willing to put up with lower-quality projected images, given the right applications.

Precisely which applications will emerge is hard to predict, says Harrison. “It’s an enabling technology, just like touch screens. Touch screens themselves aren’t that exciting,” he says—it’s what you do with them. But the team has built several sample applications: one allows users to virtually annotate a physical document, and another uses hand gestures to let OmniTouch infer whether the information being displayed should be made public or kept private.

“Using surfaces like this is not novel,” says Pranav Mistry, a researcher at MIT’s Media Lab. Indeed, two years ago, Mistry demonstrated a system called SixthSense, which projected displays from a pendant onto nearby surfaces. In the original version, Mistry used markers to detect the user’s fingers, but he says that since then, he has also been using a depth camera. “The novelty here [with OmniTouch] is the technology,” he says. “The new thing is the accuracy and making it more robust.”

Indeed, the OmniTouch team tested the system on 12 subjects to see how it compared with traditional touch screens. Presenting their findings this week at the ACM Symposium on User Interface Software and Technology in Santa Barbara, the team showed that projected buttons could shrink to about 16.2 millimeters across before users began to have trouble pressing them. With a traditional touch screen, the lower limit is typically around 15 millimeters, says Harrison.
