When Steve Jobs demonstrated Apple’s new phone at Macworld recently, the feature that elicited the most “oohs” and “aahs” from the audience was the touch-screen interface: it allowed more than one touch at a time. This “multi-touch” technology enables features such as zooming in and out of pictures and Web pages by pinching the screen with two fingers.
But the full power of multi-touch technology might be unleashed in screens far larger than those on phones. Over the past few years, Jeff Han, consulting research scientist at New York University, has developed an inexpensive way to make large multi-touch screens accommodating 10, 20, or even more fingers. He envisions applications ranging from interactive whiteboards to touch-screen tables and digital walls, any of which could be manipulated by more than one person at a time. And this month, Han unveiled Perceptive Pixel, his new company based on the technology.
“The new iPhone is too small to be a very interesting multi-touch device,” says Han, who demonstrates his technology on this YouTube video. That’s because multi-touch technology implies multiple users. More than one person gathered around a large touch screen “becomes interesting,” he says, “because multiple users can then become collaborators.” Such collaboration could take many forms, from brainstorming sessions using networked, interactive whiteboards to animation projects in which six hands mould the face of a monster. Perceptive Pixel is set to ship its first wall-size touch screen this month, to an undisclosed U.S. military customer.
Various approaches to multi-touch technology have been demonstrated at engineering conferences since the 1980s. Mitsubishi Electric Research Labs developed the DiamondTouch table, which allows a group of people to sit around and collaborate on projects. Multi-touch screens “never completely went away, but they’re coming back in different ways, and for certain things they’re going to be really important,” says Bill Buxton, principal researcher at Microsoft Research.
There are many ways to make a multi-touch screen, Han explains. Some of the early designs measured the change in electrical resistance or capacitance on a surface when fingers touched it. But these devices have limited resolution, are relatively complex, and don’t easily and inexpensively scale up to large dimensions. Apple has not disclosed what multi-touch technology it’s using on the iPhone.
Han’s touch display is made of clear acrylic with light-emitting diodes attached to the edges, illuminating the six-millimeter-thick acrylic piece with infrared light. Normally, the light from the diodes reflects along predictable paths within the acrylic, a physical phenomenon called total internal reflection. However, once a finger or other object touches the acrylic, the internal reflection is frustrated at the point of contact, and light scatters out of the surface there. A camera behind the acrylic captures this scattered light, and simple image-processing software interprets it in real time as discrete touches and strokes.
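The image-processing step can be sketched simply: the camera behind the acrylic sees a bright blob of scattered infrared light wherever a finger presses, so thresholding each frame and grouping bright pixels into connected blobs yields one touch point per finger. The sketch below is an illustration of that idea, not Perceptive Pixel’s actual software; the function name and threshold value are assumptions.

```python
def find_touches(frame, threshold=128):
    """Return the centroid (row, col) of each bright blob in a grayscale frame.

    `frame` is a 2D list of brightness values (0-255), as a camera behind the
    acrylic might report them. Each connected region of pixels above
    `threshold` is treated as one finger touching the surface.
    """
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill the blob of bright pixels starting here.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid approximates the touch point.
                touches.append((sum(p[0] for p in blob) / len(blob),
                                sum(p[1] for p in blob) / len(blob)))
    return touches
```

Run at camera frame rate, and comparing each frame’s touch points with the previous frame’s turns isolated touches into continuous strokes.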
Many researchers who’ve been working for decades on touch technology are excited to see these developments. “For almost two decades, we’ve been trapped by the tyranny of the screen, the mouse, and the keyboard,” says Don Norman, professor at Northwestern University, in Chicago, and author of The Design of Future Things, to be published in October. “It’s nice to think we’re breaking away from that and going toward touch-screen manipulation in the real physical world.”
Some researchers are even developing touchable displays that can touch back. The emerging technology that enables this is called haptics. (See “The Cutting Edge of Haptics.”) One type of haptics technology involves a surface that senses when it’s touched and then vibrates at various frequencies, depending on the placement of one’s fingers. This sort of technology could be useful for the touch keyboard on Apple’s iPhone, says Scott Klemmer, professor of computer science at Stanford University. “You wouldn’t get the tactile feel of real buttons, but [because of the vibrations] you can tell you’ve touched a real button.”
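The vibration feedback Klemmer describes amounts to a lookup: the surface maps each touch position to whatever control sits there and, if one was hit, fires a brief vibration pulse. The toy sketch below assumes a hypothetical one-row keyboard layout; the key width, the 250 Hz pulse, and the function name are illustrative assumptions, not details of any shipping device.

```python
KEY_WIDTH = 40          # assumed width of each key, in pixels
KEYS = "qwertyuiop"     # a toy one-row keyboard layout

def feedback_for_touch(x):
    """Return (key, vibration_hz) for a touch at horizontal pixel x.

    A nonzero frequency means the surface should pulse to confirm a key
    was hit; touches landing outside the keys get no vibration.
    """
    index = x // KEY_WIDTH
    if 0 <= index < len(KEYS):
        # 250 Hz is used here as an illustrative pulse frequency.
        return KEYS[index], 250
    return None, 0
```

A richer version could vary the frequency per key or pulse differently near key edges, which is how placement-dependent vibration can stand in for the missing feel of physical buttons.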