A researcher at Stanford has created an alternative to the mouse that allows a computer user to click links, highlight text, and scroll simply by looking at the screen and tapping a key on the keyboard. Using standard eye-tracking hardware (a specialized computer screen with a high-definition camera and infrared lights), Manu Kumar, a doctoral student who works with computer-science professor Terry Winograd, has developed a novel user interface that is easy to operate.
“Eye-tracking technology was developed for disabled users,” Kumar explains, “but the work that we’re doing here is trying to get it to a point where it becomes more useful for able-bodied users.” He says that nondisabled users tend to hold interfaces to a higher standard of ease of use, and the eye-tracking technology built for disabled people has not appealed to them.
At the heart of Kumar’s technology is software called EyePoint that works with standard eye-tracking hardware. The software uses an approach that requires that a person look at a Web link, for instance, and hold a “hot key” on the keyboard (usually found on the number pad on the right) as she is looking. The area of the screen that’s being looked at becomes magnified. Then, the person pinpoints her focus within the magnified region and releases the hot key, effectively clicking through to the link.
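The press-magnify-release interaction described above can be sketched in code. This is a hypothetical model of the flow, not Kumar’s actual implementation; the class name, magnification factor, and coordinate mapping are assumptions for illustration.

```python
# Hypothetical sketch of the EyePoint look-and-click flow: press the hot
# key while looking at a target, refine the gaze in a magnified view,
# release to click. Names and the magnification factor are assumptions.

class EyePointSession:
    """Models the two-step look-and-click interaction."""

    def __init__(self, magnification=4):
        self.magnification = magnification
        self.anchor = None  # gaze position when the hot key went down

    def hot_key_down(self, gaze_xy):
        # Step 1: the user looks at a target and holds the hot key;
        # the region around the gaze point is magnified.
        self.anchor = gaze_xy
        return {"action": "magnify", "center": gaze_xy,
                "factor": self.magnification}

    def hot_key_up(self, gaze_xy):
        # Step 2: the user pinpoints the target inside the magnified view
        # and releases the key. Map the refined gaze point in the
        # magnified view back to original screen coordinates and click.
        ax, ay = self.anchor
        gx, gy = gaze_xy
        target = (ax + (gx - ax) / self.magnification,
                  ay + (gy - ay) / self.magnification)
        self.anchor = None
        return {"action": "click", "target": target}


session = EyePointSession()
session.hot_key_down((400, 300))          # coarse look, key pressed
result = session.hot_key_up((440, 300))   # refined look, key released
```

The two-phase design means a stray first glance is recoverable: only the refined gaze at key release determines where the click lands.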
Kumar’s approach could take eye-tracking user interfaces in the right direction. Instead of designing a common type of gaze-based interface that is controlled completely by the eyes (for instance, a system in which a user gazes at a given link, then blinks in order to click through), he has involved the hand, which makes the interaction more natural. “He’s got the right idea to let the eye augment the hand,” says Robert Jacob, professor of computer science at Tufts University, in Medford, MA.
Rudimentary eye-tracking technology dates back to the early 1900s. Using photographic film, researchers captured reflected light from subjects’ eyes and used the information to study how people read and look at pictures. But today’s technology involves a high-resolution camera and a series of infrared light-emitting diodes. This hardware is embedded into the bezel of expensive monitors; the one Kumar uses cost $25,000. The camera picks up the movement of the pupil and the reflection of the infrared light off the cornea, which is used as a reference point because it doesn’t move.
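The pupil-plus-corneal-reflection scheme described above can be illustrated with a minimal calculation. This is a simplified sketch of the standard technique, not the tracker’s actual algorithm; the calibration gains and offsets are invented numbers, and real systems use a per-user calibration step and more sophisticated models.

```python
# Minimal sketch of pupil-center / corneal-reflection gaze estimation:
# the vector from the infrared glint (which stays put) to the pupil
# center is mapped to a screen coordinate via calibration constants.
# All calibration values here are illustrative assumptions.

def gaze_from_pccr(pupil, glint, calib):
    """Estimate a screen point from the pupil-to-glint vector
    measured in camera pixels."""
    dx = pupil[0] - glint[0]
    dy = pupil[1] - glint[1]
    # A per-user calibration maps the vector to screen coordinates.
    sx = calib["x_gain"] * dx + calib["x_offset"]
    sy = calib["y_gain"] * dy + calib["y_offset"]
    return (sx, sy)


calib = {"x_gain": 50.0, "x_offset": 960.0,
         "y_gain": 50.0, "y_offset": 540.0}
point = gaze_from_pccr(pupil=(12.0, 8.0), glint=(10.0, 8.0), calib=calib)
```

Because the corneal glint serves as a fixed reference, the estimate stays usable even when the head shifts slightly relative to the camera.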
Even the best eye tracker isn’t perfect, however. “The eye is not really very stable,” says Kumar. Even when a person is fixated on a point, the pupil jitters. So he wrote an algorithm that allows the computer to smooth out the eye jitters in real time. The rest of the research, says Kumar, involves studying how people look at a screen and figuring out a way to build an interface that “does not overload the visual channel.” In other words, he wanted to make its use feel natural to the user.
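The article does not describe Kumar’s smoothing algorithm in detail; one standard way to damp fixation jitter in real time is an exponential moving average, sketched below as an illustrative assumption rather than his actual method.

```python
# One common real-time smoothing technique for jittery gaze samples:
# an exponential moving average. This is an illustrative stand-in,
# not Kumar's published algorithm.

def smooth_gaze(samples, alpha=0.3):
    """Blend each raw (x, y) gaze sample with the running estimate.
    Lower alpha means smoother output but more lag."""
    smoothed = []
    est = None
    for x, y in samples:
        if est is None:
            est = (x, y)  # initialize on the first sample
        else:
            est = (alpha * x + (1 - alpha) * est[0],
                   alpha * y + (1 - alpha) * est[1])
        smoothed.append(est)
    return smoothed
```

The trade-off is latency versus stability: too little smoothing and the cursor-equivalent jitters with the pupil; too much and the interface feels sluggish, which is exactly the kind of tuning that makes the interaction feel natural or not.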
One of the important features of the interface, says Kumar, is that it works without a person needing to control a cursor. Unlike the mouse-based system in ubiquitous use today, EyePoint provides no feedback on where a person is looking. Previous studies have shown that it is distracting to a person when she is aware of her gaze because she consciously tries to control its location. In the usability studies that Kumar conducted, he found that people’s performance dropped when he implemented a blue dot that followed their eyes.
In his studies of 20 people, he found that participants who needed to type and point could point faster using the gaze-based approach than using a mouse, although the error rate (20 percent) was fairly high. But overall, about 90 percent of participants reported that they preferred using EyePoint to the mouse.
It’s the 20 percent error rate that could cause some problems, says Ted Selker, professor at the MIT Media and Arts Technology Laboratory. “[It’s] a huge amount,” he says, “because a person can notice a significant decline in accuracy at just 5 percent.” Selker adds that the low accuracy could make text editing a challenge.
Kumar concedes that the system isn’t perfect, but he contends that many of the errors came from people who, due to lack of practice, clicked links that they thought they had looked at but that were only in their peripheral vision. Indeed, he says, trackpads, trackpoints, and trackballs do not perform as well as a mouse either, yet they remain viable input devices. Kumar says he’s been working on algorithms that show promise for making EyePoint more accurate by accounting for peripheral-vision errors. Still, he allows that EyePoint might work poorly for certain people, such as those with thick glasses, special contact lenses, or lazy eyes.
Even so, Kumar is confident in the technology and its development as a tool for the general population. To that end, he has tested a number of different interface schemes, all under a project called Gaze-enhanced User Interface Design (GUIDe). One application, called EyeExposé, is built on Exposé, an Apple OS X feature in which a person can hit the F11 key to miniaturize all open windows, then drag the mouse cursor to the window she wants to bring forward. With EyeExposé, the user can hit the F11 key, then bring forward the window she is looking at by tapping a keyboard key. Kumar has also modified the “scroll lock” key in an application called EyeScroll: as a person reads, the screen slowly reveals more text. In addition, he is testing a modified version of the “page up” and “page down” keys. When a person reads to the bottom of a page, the software automatically scrolls down one page; to help the reader keep her place, the most recently viewed part of the screen is highlighted.
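The automatic page-down behavior just described can be sketched as a small decision function. This is a hypothetical model of the logic, with an assumed trigger threshold; the article does not specify how the real software decides when to scroll.

```python
# Hypothetical sketch of gaze-driven page scrolling: when the reader's
# gaze nears the bottom of the visible page, scroll one page and report
# the last-read position so it can be highlighted. The 90% threshold
# is an assumption for illustration.

def auto_page(gaze_y, scroll_top, page_height, threshold=0.9):
    """Return (new_scroll_top, highlight_y). Scrolls when the gaze
    passes `threshold` of the way down the visible page; otherwise
    leaves the view unchanged and returns no highlight."""
    if gaze_y - scroll_top >= threshold * page_height:
        # Scroll one full page and remember where the reader left off.
        return scroll_top + page_height, gaze_y
    return scroll_top, None
```

Highlighting the last-read spot matters because a full-page jump otherwise forces the reader to visually re-search for her place, which would undercut the hands-free benefit.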
The important thing about the Stanford research, says Shumin Zhai, a researcher at the IBM Almaden Research Center in San Jose, CA, and a pioneer in the eye-tracking field, is that Kumar “has been working on making eye tracking practical for everyday tasks.” However, Zhai notes that a barrier may remain for the average person, who must first go through a calibration process in which the software measures how her eyes move.
There are some signs that eye-tracking technology could find its way to the consumer market soon. Apple’s desktops and laptops are now equipped with a built-in camera for videoconferencing. If a higher-resolution camera, infrared LEDs, and software were added, Apple’s machines would be able to support applications from the GUIDe project, says Kumar. If eye tracking proves appealing to the consumer, and the hardware costs drop to a reasonable range, eye-tracking interfaces could provide an alluring and entertaining alternative to the mouse or laptop track pad. “It’s almost like magic when it’s working,” says Tufts’s Jacob. “The sensation you get is that the computer’s reading your mind, and that’s really very powerful.”