In his studies of 20 people, he found that participants who needed to both type and point could point faster using the gaze-based approach than using a mouse, although the error rate, at 20 percent, was fairly high. Even so, about 90 percent of participants reported that they preferred using EyePoint to the mouse.
It’s the 20 percent error rate that could cause some problems, says Ted Selker, professor at the MIT Media and Arts Technology Laboratory. “[It’s] a huge amount,” he says, “because a person can notice a significant decline in accuracy at just 5 percent.” Selker adds that the low accuracy could make text editing a challenge.
Kumar concedes that the system isn’t perfect, but he contends that many of the errors came from people who, due to lack of practice, clicked links that they thought they had looked at but that were actually only in their peripheral vision. Indeed, he says, trackpads, trackpoints, and trackballs do not perform as well as a mouse either, yet they remain viable input devices. Kumar says he has been working on algorithms that show promise for making EyePoint more accurate by accounting for these peripheral-vision errors. Still, he allows that EyePoint might work poorly for certain people, such as those with thick glasses, special contact lenses, or lazy eyes.
Even so, Kumar is confident in the technology and its development as a tool for the general population. To that end, he has tested a number of different interface schemes, all under a project called Gaze-enhanced User Interface Design (GUIDe). One application, called EyeExposé, works with Apple’s OS X feature Exposé, in which a person can hit the F11 key to miniaturize all open windows, then move the mouse cursor to the window she wants to bring forward. With EyeExposé, the user can hit the F11 key, then bring a window of interest forward simply by looking at it and tapping a keyboard key. Kumar has also modified the “scroll lock” key in an application called EyeScroll: as a person reads, the screen slowly reveals more text. In addition, he is testing a modified version of the “page up” and “page down” keys. When a person reads to the bottom of a page, the software automatically scrolls down one page; to help the reader keep her place, the most recently viewed part of the screen is highlighted.
The important thing about the Stanford research, says Shumin Zhai, a researcher at IBM’s Almaden Research Center in San Jose, CA, and a pioneer in the eye-tracking field, is that Kumar “has been working on making eye tracking practical for everyday tasks.” However, Zhai notes that there may still be a barrier for the average person, who needs to go through a calibration process in which the software measures how quickly her eyes move.
There are some signs that eye-tracking technology could find its way to the consumer market soon. Apple’s desktops and laptops now come equipped with a built-in camera for videoconferencing. If a higher-resolution camera, infrared LEDs, and software were added, Apple’s machines would be able to support applications from the GUIDe project, says Kumar. If eye tracking proves appealing to consumers, and the hardware costs drop to a reasonable range, eye-tracking interfaces could provide an alluring alternative to the mouse or laptop trackpad. “It’s almost like magic when it’s working,” says Tufts’s Jacob. “The sensation you get is that the computer’s reading your mind, and that’s really very powerful.”