The screens on many mobile phones can leave a user feeling distinctly vision-impaired, especially if her attention is divided between tapping virtual buttons and walking or driving. Fortunately, engineers at Google are experimenting with interfaces for Android-powered mobile phones that require no visual attention at all. At Google I/O, the company's annual developer conference held in San Francisco last week, T.V. Raman, a research scientist at Google, demonstrated an adaptive, circular interface for phones that provides audio and tactile feedback.
"We are building a user interface that goes over and beyond the screen," says Raman. Eyes-free interfaces are often designed with blind users in mind, but Raman, who is himself blind, emphasizes that they have much broader implications. "This is not just about the blind user," he says. "This is about how to use these devices if you're not in a position to look at the machine."
Eyes-free interfaces aren't new. In fact, in 1994, Bill Buxton, a researcher at Microsoft, explored the idea of marking menus: radial menus designed to be easier to use without looking than pull-down lists. In recent years, Patrick Baudisch, another Microsoft researcher, who is also a professor at the Hasso Plattner Institute, in Germany, has applied the approach to MP3 menus that also provide audio feedback.
Some mobile phones already support vibrational feedback, but for the most part, gadget interfaces require intensive visual attention. According to Google’s Raman, Android could be one of the first phone platforms to enable a broad range of eyes-free interfaces. The Android platform supports vibrational and audio feedback, and at the conference, Raman and his colleague Charles Chen demonstrated that an eyes-free alternative can be added to almost any Android application with just a few lines of code.
The researchers showed off their interface as a way to dial numbers and search through contacts on a phone. One problem with most graphical user interfaces, says Raman, is that the buttons sit at fixed locations on the screen, which is inconvenient if you can't feel them. To address this, his interface appears wherever a finger first touches the screen, centering itself on that initial point of contact.
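The relative-positioning idea can be sketched in plain Java: the digit 5 sits under the initial touch, and the other digits occupy a standard keypad grid around it, so the user selects by direction and distance rather than absolute screen position. This is a minimal illustrative sketch, not Google's actual code; the class name, grid geometry, and distance threshold are all assumptions.

```java
// Hypothetical sketch of a relative-position dialer: the first touch
// anchors the digit 5, and other digits are selected by moving the
// finger relative to that anchor. Grid geometry is an assumption.
public class EyesFreeDialer {
    // Standard phone-keypad layout, with 5 at the center (row 1, col 1).
    private static final char[][] KEYPAD = {
        {'1', '2', '3'},
        {'4', '5', '6'},
        {'7', '8', '9'}
    };

    private final float originX;
    private final float originY;
    private final float cellSize; // how far the finger moves to reach an adjacent key

    public EyesFreeDialer(float originX, float originY, float cellSize) {
        this.originX = originX;
        this.originY = originY;
        this.cellSize = cellSize;
    }

    /** Maps the finger's current position to a digit relative to the first touch. */
    public char digitAt(float x, float y) {
        int col = clamp(Math.round((x - originX) / cellSize) + 1);
        int row = clamp(Math.round((y - originY) / cellSize) + 1);
        return KEYPAD[row][col];
    }

    private static int clamp(int i) {
        return Math.max(0, Math.min(2, i));
    }
}
```

Because every position is computed relative to the first touch, the same gesture produces the same digit no matter where on the screen the interaction begins, which is what makes the design usable without looking.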