
Touch Screens for Many Fingers

Researchers have bigger plans for multi-touch screens than the novel interface on Apple’s iPhone.
January 18, 2007

When Steve Jobs demonstrated Apple’s new phone at Macworld recently, the feature that elicited the most “oohs” and “aahs” from the audience was the touch-screen interface: it allowed more than one touch at a time. This “multi-touch” technology enables gestures such as pinching the screen with two fingers to easily zoom in and out of pictures and Web pages.

Large multi-touch displays enable two or more fingers to tap and trace on a surface. When combined with software, these screens could allow large-scale collaboration among many people.

But the full power of multi-touch technology might be unleashed in screens far larger than those on phones. Over the past few years, Jeff Han, a consulting research scientist at New York University, has developed an inexpensive way to make large multi-touch screens that accommodate 10, 20, or even more fingers. He envisions applications ranging from interactive whiteboards to touch-screen tables and digital walls, any of which could be manipulated by more than one person. And this month Han unveiled Perceptive Pixel, a new company based on the technology.

“The new iPhone is too small to be a very interesting multi-touch device,” says Han, who demonstrates his technology in this YouTube video. That’s because multi-touch technology implies multiple users. More than one person gathered around a large touch screen “becomes interesting,” he says, “because multiple users can then become collaborators.” Such collaboration could take many forms, from brainstorming sessions using networked, interactive whiteboards to animation work in which six hands mold the face of a monster. Perceptive Pixel is set to ship its first wall-size touch screen this month, to an undisclosed U.S. military customer.

Various approaches to multi-touch technology have been demonstrated at engineering conferences since the 1980s. Mitsubishi Electric Research Labs developed the DiamondTouch table, which allows a group of people to sit around it and collaborate on projects. Multi-touch screens “never completely went away, but they’re coming back in different ways, and for certain things they’re going to be really important,” says Bill Buxton, principal researcher at Microsoft Research.

There are many ways to make a multi-touch screen, Han explains. Some of the early designs measured the change in electrical resistance or capacitance on a surface when fingers touched it. But these devices have limited resolution, are relatively complex, and don’t easily and inexpensively scale up to large dimensions. Apple has not disclosed what multi-touch technology it’s using on the iPhone.

Han’s touch display is made of clear acrylic with light-emitting diodes attached to the edges, flooding the six-millimeter-thick acrylic sheet with infrared light. Normally, the light bounces along predictable paths inside the acrylic, trapped by a physical phenomenon called total internal reflection. But when a finger or other object presses against the acrylic, it frustrates that reflection at the point of contact, and the infrared light scatters out of the surface. A camera behind the acrylic captures the scattered light, and simple image-processing software interprets the bright spots in real time as discrete touches and strokes.
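
As an illustration of that last step, here is a minimal sketch of how touch points might be pulled out of the camera image: threshold the frame to isolate the bright infrared blobs that appear where fingers frustrate the reflection, then report each blob’s center. It assumes an infrared camera readable through OpenCV; the threshold and blob-size values are illustrative guesses, not details of Han’s actual pipeline.

import cv2

# Illustrative limits for fingertip-sized blobs; a real system would be
# calibrated to the camera, the acrylic, and ambient infrared light.
MIN_BLOB_AREA = 30
MAX_BLOB_AREA = 2000

def detect_touches(gray_frame):
    """Return (x, y) centroids of bright, fingertip-sized blobs."""
    blurred = cv2.GaussianBlur(gray_frame, (5, 5), 0)
    # A touch frustrates total internal reflection, so it shows up as a
    # bright spot against a mostly dark background.
    _, mask = cv2.threshold(blurred, 200, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    touches = []
    for i in range(1, n):  # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if MIN_BLOB_AREA <= area <= MAX_BLOB_AREA:
            touches.append(tuple(centroids[i]))
    return touches

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # the camera mounted behind the acrylic
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        print(detect_touches(gray))

Tracking these blobs from one frame to the next, rather than detecting them independently, is what turns isolated touch points into the strokes and gestures the software responds to.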

Many researchers who’ve been working for decades on touch technology are excited to see these developments. “For almost two decades, we’ve been trapped by the tyranny of the screen, the mouse, and the keyboard,” says Don Norman, a professor at Northwestern University, in Evanston, Illinois, and author of The Design of Future Things, to be published in October. “It’s nice to think we’re breaking away from that and going toward touch-screen manipulation in the real physical world.”

Some researchers are even developing touchable displays that can touch back. The emerging technology that enables this is called haptics. (See “The Cutting Edge of Haptics.”) One type of haptics technology involves a surface that senses when it’s touched and then vibrates at various frequencies, depending on the placement of one’s fingers. This sort of technology could be useful for the touch keyboard on Apple’s iPhone, says Scott Klemmer, professor of computer science at Stanford University. “You wouldn’t get the tactile feel of real buttons, but [because of the vibrations] you can tell you’ve touched a real button.”
