Touch Screens with Pop-up Buttons
Touch-screen technology has become wildly popular, thanks to smartphones designed for nimble fingers. But most touch screens have a major drawback: you need to keep a close eye on the screen as you tap, to make sure that you hit the right virtual buttons. As touch screens become more popular in other contexts, such as in-car navigation and entertainment systems, this lack of tactile feedback could become a dangerous distraction.
Now researchers at Carnegie Mellon University have developed buttons that pop out from a touch-screen surface. The design retains the dynamic display capabilities of a normal touch screen but can also produce tactile buttons for certain functions.
Graduate student Chris Harrison and computer-science professor Scott Hudson have built a handful of proof-of-concept displays with the morphing buttons. The screens are covered in semitransparent latex, which sits on top of an acrylic plate with shaped holes and an air chamber connected to a pump. When the pump is off, the screen is flat; when it’s switched on, the latex forms concave or convex features around the cutouts, depending on negative or positive pressure.
To illuminate the screens and give them multitouch capabilities, the researchers use projectors, infrared light, and cameras positioned below the surface. The projectors cast images onto the screens while the cameras sense infrared light scattered by fingers at the surface.
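The camera-based sensing described above amounts to finding bright spots in an infrared image: a finger at the surface scatters IR light back toward the camera. As a rough illustration (not the researchers' actual software), a minimal blob detector over a grayscale IR frame might look like this; the threshold and minimum-area values are illustrative assumptions.

```python
import numpy as np

def detect_touches(ir_frame, threshold=200, min_area=4):
    """Find bright blobs (finger contacts) in an infrared camera frame.

    A touching finger scatters IR light back to the camera, appearing as
    a bright region on a dark background. This simplified flood-fill
    detector returns one (x, y) centroid per blob; a real system would
    use a vision library plus projector-camera calibration.
    """
    bright = ir_frame >= threshold
    visited = np.zeros_like(bright, dtype=bool)
    touches = []
    h, w = bright.shape
    for y in range(h):
        for x in range(w):
            if bright[y, x] and not visited[y, x]:
                # Flood fill to collect this connected bright region.
                stack, pixels = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and bright[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:  # ignore single-pixel noise
                    ys, xs = zip(*pixels)
                    touches.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return touches
```

Each centroid in camera coordinates would then be mapped back to screen coordinates to decide which on-screen (or pop-up) button was pressed.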
The idea of physically dynamic interfaces isn’t new, and in recent years, researchers have explored screens made from polymers that change shape when exposed to heat, light, or magnetic fields. However, these materials are still experimental and relatively expensive to make.
Simpler systems, such as those that use a flexible material like latex and a pneumatic pump, have also been explored by researchers in the past. However, these systems haven’t had all the capabilities of the Carnegie Mellon project, Harrison says. He explains that the display is the first to combine moving parts (the pop-up buttons), a dynamic information display, and touch sensitivity. Other projects and products usually achieve only two of these three, he says.
“Microsoft Surface does graphics, and you can touch it, but it’s totally fixed,” Harrison says. “Buttons on a dashboard have great tactile input, but there’s no display. And it’s not like you can just deform an LCD screen and … make it electrically conductive at the same time.”
Because the system is pressurized, the pressure information can itself be used as an input, Harrison says. For example, if the screen were used to control an MP3 player, a person could press a button harder to scan through radio stations or songs faster. While many touch-screen displays can also register different levels of pressure, the glass or rigid plastic used doesn’t provide any tactile feedback.
Rob Miller, a professor of electrical engineering and computer science at MIT, says that this type of interface is particularly likely to find its way into car dashboards. “When you’re driving a car, you’re situationally impaired,” he says. “Your eyes need to be on the road, not hunting for the right button and watching whether you pressed it right.”
In a small user study involving the Carnegie Mellon display, testers performing a simulated driving task found the pneumatic buttons as easy to use as static ones. They also glanced down at the pneumatic buttons no more often than they glanced at conventional physical buttons.
Due to its pneumatic nature, the system is currently fairly large, but Harrison says that he is trying to find ways to shrink it. “You can’t get a pump inside a cell phone,” he says, “but one possibility is to have a balloon and squeeze it using a conventional motor.”