A camera above the screen is a standard feature on laptops. But only Lenovo’s new model has a pair of cameras below its display to track the movements of a user’s eyes.
The prototype laptop can be controlled with eye motions, reducing the need to use the mouse and making it faster to navigate through information such as maps or menus.
The laptop can notice when its user has read nearly to the bottom of a page and automatically scroll down to reveal more text. The same trick makes it possible to browse through an e-mail in-box without using the mouse at all. When using a map application, the user can zoom in on an area by looking at it and scrolling the mouse wheel. The computer also dims its screen automatically to save power when it detects that the user’s gaze has left the screen.
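The behaviors described here amount to a simple dispatch on each gaze sample. A minimal sketch, assuming a hypothetical `gaze_action` function and made-up thresholds (the real prototype's parameters are not public):

```python
def gaze_action(gaze, screen_height, at_end=False):
    """Map one gaze sample to an interface behavior.

    gaze is an (x, y) screen position in pixels, or None when the
    trackers lose the eyes (the user looked away). The 85% trigger
    line is an illustrative assumption, not Tobii's actual value.
    """
    if gaze is None:
        return "dim"                      # gaze left the screen: save power
    _, y = gaze
    if y > 0.85 * screen_height and not at_end:
        return "scroll"                   # reader near the bottom of the page
    return "none"
```

A driver loop would call this 40 times per second, once per gaze update, and invoke the matching UI action.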
“We’re attempting to make the process of interacting with your computer a more natural experience,” says Barbara Barclay, who heads North American operations for Tobii, the Swedish company that supplied the eye-tracking hardware and software for the prototype. So far, only 20 of the new computers have been made; Tobii and Lenovo will each have 10 with which to test out new ideas.
The two cameras below the laptop’s screen use infrared light to track a user’s pupils. An infrared light source located next to the cameras lights up the user’s face and creates a “glint” in the eyes that can be accurately tracked. The position of those points is used to create a 3-D model of the eyes that is used to calculate what part of the screen the user is looking at; the information is updated 40 times per second.
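The reason the glint matters is that the vector from the corneal reflection to the pupil center changes as the eye rotates but is largely insensitive to small head movements. A toy stand-in for the idea, using a simple linear calibration rather than the full 3-D eye model the article describes (function name and calibration form are illustrative):

```python
def gaze_point(pupil, glint, cal):
    """Estimate an on-screen gaze point from one eye's pupil center
    and corneal glint, both given in camera-image coordinates.

    cal = (ax, bx, ay, by): per-axis gain and offset, which a real
    system would learn by having the user look at known targets
    during calibration.
    """
    dx = pupil[0] - glint[0]   # pupil-glint vector tracks eye rotation,
    dy = pupil[1] - glint[1]   # not head translation
    ax, bx, ay, by = cal
    return (ax * dx + bx, ay * dy + by)
```

Production trackers fit a geometric model of the cornea and pupil instead of a linear map, which is what lets them tolerate larger head movement.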
The system can track the direction of the user’s gaze to within about 0.5 degrees, which translates to about half an inch on the laptop’s screen. A user can shift position, says Barclay, but the head must stay within a volume of roughly two cubic feet. Because the hardware is mounted on the laptop’s LCD screen and moves with it, a user’s efforts to accommodate the display’s limited viewing angle by adjusting head and monitor position usually keep the eyes in proper camera range.
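An angular accuracy figure maps to an on-screen distance through simple trigonometry, so the physical error depends on how far back the user sits. A minimal sketch of that conversion:

```python
import math

def screen_error_mm(angle_deg, viewing_distance_mm):
    """On-screen error for a given angular tracking error:
    error = distance * tan(angle). The error grows linearly
    with viewing distance.
    """
    return viewing_distance_mm * math.tan(math.radians(angle_deg))
```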
Tobii’s eye-tracking technology has been used for years in academic research, in specialized products for people unable to use conventional computer interfaces, and by designers of everything from websites to product packaging. “We built this conceptual prototype to see how close we are to being ready to use eye tracking for the mass market,” says Barclay. “We think it may be ready.”
Manu Kumar, who worked on eye-tracking techniques at Stanford University and now runs the seed-stage venture capital fund K9 Ventures, says that such technology, if well designed, has a lot to offer most computer users. “To computers, humans are really just a big finger; everything is based around that mode of input,” he says. “Using eye tracking increases the bandwidth between the human and the computer.”
But to be successful, a system must use eye gaze the way humans do when interacting face to face, says Kumar: to understand a person’s intention, and not simply as a new way to drive a cursor. “When I press Page Down to scroll text today, the computer has no idea where I had got to and often makes me lose my place,” he says. “When eye gaze is used as an augmented input, you can do things in a more efficient manner.”
Kumar created and tested a feature similar to Tobii’s that automatically scrolled when a person reached the end of a page. To help the reader keep their place after a scroll, his version faded out the text that had already been read; it proved popular in user trials, he says. A mapping app that used gaze to zoom was less successful, though, because if the system misjudged the user’s eye position by even a little, the error was magnified.
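A scroll in this spirit can be sketched in a few lines: once the reader's gaze reaches the last lines of the viewport, the page jumps so the unread text fills the screen, and the already-read lines still visible at the top are marked to be rendered faded. All names and thresholds here are illustrative, not Kumar's actual implementation:

```python
def advance_page(read_upto_line, lines_per_page):
    """Return (new top line, list of line indices to render faded).

    Scrolls only when the reader's gaze has reached the bottom two
    lines of the viewport; the already-read lines left visible at
    the top are faded so the eye can re-find its place.
    """
    trigger = lines_per_page - 2              # near the bottom of the page
    if read_upto_line < trigger:
        return 0, []                          # no scroll yet, nothing faded
    new_top = trigger                         # jump read text off the top
    faded = list(range(new_top, read_upto_line + 1))
    return new_top, faded
```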
Tobii and Lenovo are likely to find many ways consumers could use eye-tracking technology, but they will still have to face economic realities. “The key question is, what does it cost?” says Kumar. “I think that it will need to ship at very high volume—likely millions of units—for the hardware to be cheap enough for consumer laptops.”