
The Invisible iPhone

A new interface lets you keep your phone in your pocket and use apps or answer calls by tapping your hand.

Over time, using your smartphone's touch screen becomes second nature, to the point where you can do some tasks without looking. Researchers in Germany are now working on a system that would let you perform such actions without even holding the phone—instead, you'd tap your palm, and the movements would be interpreted by an "imaginary phone" system that relays the request to your actual phone.

Point and click: The “imaginary phone” determines which iPhone app a person wants to use by matching his or her finger position to the position of the app on the screen.

The concept relies on a depth-sensitive camera to pick up the tapping and sliding interactions on a palm, software to analyze the video, and a wireless radio to send the instructions back to the iPhone. Patrick Baudisch, professor of computer science at the Hasso Plattner Institute in Potsdam, Germany, says the imaginary phone prototype “serves as a shortcut that frees users from the necessity to retrieve the actual physical device.”

Baudisch and his team envision someone doing dishes when his smartphone rings. Instead of quickly drying his hands and fumbling for the device, he could simply slide a finger across his palm to answer the call remotely.

The imaginary phone project, developed by Baudisch with Hasso Plattner Institute students Sean Gustafson and Christian Holz, is reminiscent of SixthSense, a gesture-based interface developed by Pattie Maes and Pranav Mistry of MIT, but it differs in two significant ways. First, there are no new gestures to learn—the invisible phone concept simply transfers the iPhone screen onto a hand. Second, unlike SixthSense, which uses a projector to provide an interface on any surface, there is no visual feedback. That limitation constrains the imaginary phone, but it isn't intended to completely replace the device, just to make certain interactions more convenient.

Last year, Baudisch and Gustafson developed an interface in which a wearable camera captures gestures that a person makes in the air and translates them to drawings on a screen.

For the current project, the researchers used a depth camera similar to the one in Microsoft’s Kinect for Xbox, but bulkier and positioned on a tripod. (Ultimately, a smaller, wearable depth camera could be used.) The camera “subtracts” the background and tracks the finger position on the palm, and it works well in various lighting conditions, including direct sunlight. Software interprets finger positions and movements and correlates them with the positions of icons on a person’s iPhone. A Wi-Fi radio transmits these interactions to the phone.
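The icon-matching step can be sketched in a few lines: once the tracker reports a fingertip position on the palm, the software only needs to decide which home-screen grid cell that position falls into. The sketch below is a minimal illustration of that idea; the 4×4 grid, the app layout, and the normalized-coordinate convention are assumptions for the example, not the researchers' actual implementation.

```python
# Illustrative grid layout (an assumption, not the study's layout):
# rows top-to-bottom, columns left-to-right, like an iPhone home screen.
APP_GRID = [
    ["Phone",    "Mail",       "Safari", "Music"],
    ["Messages", "Calendar",   "Photos", "Camera"],
    ["Maps",     "Weather",    "Notes",  "Clock"],
    ["Settings", "Calculator", "Stocks", "Voice Memos"],
]

def app_at(x: float, y: float) -> str:
    """Map a fingertip position on the palm, normalized to [0, 1)
    (x increasing rightward, y increasing downward), to the app
    occupying that cell of the imagined home screen."""
    rows, cols = len(APP_GRID), len(APP_GRID[0])
    col = min(int(x * cols), cols - 1)  # clamp so x == 1.0 stays in range
    row = min(int(y * rows), rows - 1)
    return APP_GRID[row][col]
```

A tap near the top-left corner of the palm (`app_at(0.05, 0.05)`) would select "Phone", while one near the bottom-right would select "Voice Memos"; the real system would then send that selection to the handset over Wi-Fi.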

In a study that has been submitted to the User Interface Software and Technology conference in October, the researchers found that participants could accurately recall the position of about two-thirds of their iPhone apps on a blank phone and with similar accuracy on their palm. The position of apps used more frequently was recalled with up to 80 percent accuracy.

Finger mouse: A depth camera picks up finger position and subtracts the background images to correctly interpret interactions.

“It’s a little bit like learning to touch type on a keyboard, but without any formal system or the benefit of the feel of the keys,” says Daniel Vogel, a postdoctoral fellow at the University of Waterloo who wasn’t involved in the research. He notes that “it’s possible that voice control could serve the same purpose, but the imaginary approach would work in noisy locations and is much more subtle than announcing, ‘iPhone, open my e-mail.’”
