Better Touch Screens for Mobile Phones

Keypads on smooth touch screens are prone to errors, but new ways of providing tactile feedback could make them more accurate.

For all its designer appeal, Apple's forthcoming iPhone is lacking. While the touch-screen interface looks beautiful, it will most likely suffer from the same drawback that plagues many other mobile gadgets with touch screens: no tactile feedback. Buttons on a flat, slick display simply don't feel like buttons, and as a result, people are prone to making errors with them.

Tactile touch screens: University of Glasgow researchers determined that people type more accurately on a touch screen when the phone vibrates to indicate that a button has been pressed correctly. The researchers affixed a high-end actuator to the back of a PDA (top two images). The researchers are also developing applications for multiple actuators on the phone (bottom two images). Eventually, similar, smaller actuators could be used inside mobile devices.

But within the next few years, those faux touch-screen buttons could feel more like real buttons, thanks to research at a handful of universities and companies that are investigating touch-based feedback from gadgets. By feeling a buzz when they press a button correctly, people become more accurate typists on touch-screen keyboards, says Stephen Brewster, professor of computing science at the University of Glasgow, in the United Kingdom. Brewster and his team have found that people err (mistyping, double-pressing, or slipping from one button to another) up to 25 percent less frequently when vibrations are used to let them know that they've pressed a button correctly.

“The basic thing we show,” Brewster says, “is that having tactile feedback makes [mobile devices] more useful and usable.” Without tactile feedback, he says, people are still going to have usability issues no matter how well the touch screen is designed to ignore extra touches or accidental taps. In addition to trying to get rid of errors, Brewster and his team are exploring how well different types of vibrations convey various kinds of information, such as the urgency of an e-mail.

Today, almost all phones have the ability to buzz when someone calls. But this kind of vibration amounts to an announcement. The idea of using vibrations as feedback is a relatively new one and comes from an emerging research area called haptics, technology that involves human-machine interactions based on touch. Haptics is being explored for a range of applications, from mobile-device feedback to remote surgery. (See “The Cutting Edge of Haptics.”)

A San Jose-based company called Immersion is using haptics in a number of applications, and it has already developed haptic technology for Samsung mobile phones on the market today. The basic idea behind Brewster's research and the technology from Immersion is the same: when a button on a touch screen is pressed, actuators inside or on the phone vibrate. In both cases, the vibration doesn't occur solely underneath the user's finger. Instead, the whole phone vibrates, which effectively creates the sensation of a button below the pressing finger, says Brewster.

While Immersion's technology currently uses the same, somewhat limited actuators that are employed when a call is received, Brewster's team is using specialized, somewhat expensive actuators to explore how people respond to different types of vibrations. Research conducted by Eve Hoggan, a PhD student in Brewster's group, has shown that people can distinguish among different types of vibrations. A vibration can feel rough or smooth depending on the shape of the electrical current used to power the actuator: a sine wave produces a smooth vibration, while a sawtooth wave produces a rough one.
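As a rough illustration of the waveform idea (this is not the Glasgow group's actual drive code; the function name, sample rate, and burst length are illustrative assumptions), the two textures come from feeding the actuator differently shaped periodic signals at the same frequency:

```python
import math

def drive_samples(waveform, freq_hz=250, duration_s=0.02, sample_rate=8000):
    """Generate one short burst of actuator drive samples in [-1, 1].

    A sine wave yields a smooth-feeling vibration; a sawtooth, with its
    abrupt jump each cycle, yields a rough-feeling one.
    """
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        phase = (freq_hz * i / sample_rate) % 1.0  # position within the cycle, 0..1
        if waveform == "sine":
            samples.append(math.sin(2 * math.pi * phase))  # smooth texture
        elif waveform == "sawtooth":
            samples.append(2 * phase - 1)  # ramps up, snaps back: rough texture
        else:
            raise ValueError(f"unknown waveform: {waveform}")
    return samples

smooth = drive_samples("sine")
rough = drive_samples("sawtooth")
```

The same frequency and amplitude, differing only in shape, is what lets one actuator present perceptibly different "textures" to the user.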

Hoggan has found that people can recognize the difference in these vibrations 94 percent of the time. Moreover, people can distinguish 81 percent of the time among the vibrational frequencies of 6 hertz, 70 hertz, and 250 hertz. Knowing how well people can recognize different types of vibrations could help expand the usefulness of a keyboard. For instance, if a user's finger lands on one touch-screen key but slips and lifts off from another (so that no character is entered), the feedback could be a rough vibration, indicating that an error has been made. Moreover, different types of keys or functions could produce different types of vibrations.
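One way to picture this is as a lookup from keyboard events to tactile cues. The mapping below is a hypothetical sketch (the event names and frequency pairings are assumptions, not Hoggan's design), following the article's logic of smooth cues for success and rough cues for errors, drawn from the distinguishable frequencies mentioned above:

```python
# Hypothetical event-to-cue table: smooth (sine) for confirmed input,
# rough (sawtooth) for errors, using frequencies users can tell apart.
FEEDBACK_CUES = {
    "key_pressed":  {"waveform": "sine",     "freq_hz": 250},  # press confirmed
    "key_slip":     {"waveform": "sawtooth", "freq_hz": 70},   # slipped off a key
    "double_press": {"waveform": "sawtooth", "freq_hz": 250},  # accidental repeat
}

def cue_for(event):
    """Return the tactile cue for a keyboard event, with a gentle default."""
    return FEEDBACK_CUES.get(event, {"waveform": "sine", "freq_hz": 70})
```

Different keys or functions could simply map to different rows of such a table.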

Brewster and his team are exploring how the placement of the actuators in the phone can be useful. Brewster has placed four actuators on the edges of a phone, where a right-handed person would likely grip the gadget. He says that these actuators could be used, for instance, to denote the progress of a downloaded file: each actuator would vibrate in sequence, with each buzz less intense than the last, until the file has completely downloaded. Moreover, Brewster says, the actuators could be used to help people navigate. If the phone were equipped with global positioning sensors and the location of a destination were entered, the vibrations could help direct a person to the right or the left.
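The download-progress cue Brewster describes can be sketched as a simple schedule: fire the four actuators in turn, each pulse weaker than the last. This is an illustrative reconstruction from the article's description, not his implementation; the function and its output format are assumptions:

```python
def download_cue(n_actuators=4):
    """One progress cue: (actuator_index, intensity) pairs fired in sequence,
    each buzz less intense than the last, as in Brewster's four-actuator demo."""
    return [(i, 1.0 - i / n_actuators) for i in range(n_actuators)]

# Four pulses, stepping down from full strength:
# actuator 0 at 1.0, actuator 1 at 0.75, actuator 2 at 0.5, actuator 3 at 0.25
```

Repeating the sequence until the transfer completes would give the user a spatial, wordless sense of ongoing progress; a left- or right-biased version of the same idea could supply the turn-by-turn navigation hints Brewster mentions.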

Danny Grant, vice president of research at Immersion, thinks that Brewster’s research will keep pushing the capabilities of haptic feedback in mobile devices. “One of the things that we’re hoping for with haptics is to convey information using different methods,” Grant says. “Already, the screen is small, and a touch screen is worse because a finger blocks it. Using the sense of touch and conveying more information that way is a big win. Any work that shows the range we can achieve in information transfer is great.”

Brewster suspects that within the next couple of years, vibrational feedback will be much more common, and people will be able to pick out the different types of vibrations they want to use on their phones, much as people now choose the wallpaper on their phone’s screen.
