
Control Your Smartphone with Your Eyes

Researchers are making mobile software that could let you rely on eye movements to play games (or do other things).

In an effort to make eye tracking cheap, compact, and accurate enough to be included in smartphones, a group of researchers is crowdsourcing the collection of gaze information and using it to teach mobile software how to figure out where you’re looking in real time.

Researchers at MIT, the University of Georgia, and Germany’s Max Planck Institute for Informatics are working on the project, and they say that so far they’ve been able to train software to identify where a person is looking with an accuracy of about a centimeter on a mobile phone and 1.7 centimeters on a tablet.


That isn’t very precise when you consider how small a smartphone screen is, and it’s still not exact enough for consumer applications, says Aditya Khosla, a graduate student at MIT and coauthor of a paper on the work that was presented at a computer vision conference this week.


But he believes the system’s accuracy will improve with more data. If it does, eye tracking could become a lot more widespread and useful; it has typically been expensive and has required specialized hardware, which has made it tricky to build into gadgets like phones and tablets. The technology could also let you play games or navigate your smartphone without having to tap or swipe.

The researchers started out by building an iPhone app called GazeCapture that gathered data about how people look at their phones in different environments outside the confines of a lab. The app recorded users’ gaze with the phone’s front camera as they watched pulsating dots on the screen. To make sure they were paying attention, users were then shown a dot with an “L” or “R” inside it and had to tap the left or right side of the screen in response.
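Conceptually, the collection loop works something like the following Python sketch. This is not the researchers’ app (GazeCapture is an iOS app that records real camera frames); the screen dimensions, the record format, and the way failed attention checks are handled here are illustrative assumptions.

```python
import random

SCREEN_W, SCREEN_H = 750, 1334  # assumed iPhone-sized screen, in pixels

def show_dot_and_capture(x, y):
    """Stand-in for displaying a pulsating dot at (x, y) and grabbing a
    front-camera frame while the user looks at it."""
    return {"dot_x": x, "dot_y": y, "frame": f"frame_at_{x}_{y}.jpg"}

def attention_check(tap_side):
    """Show a dot labeled 'L' or 'R'; the user must tap the matching side."""
    label = random.choice(["L", "R"])
    return tap_side(label) == label

def collect_session(num_dots, tap_side):
    samples = []
    for _ in range(num_dots):
        x, y = random.randint(0, SCREEN_W), random.randint(0, SCREEN_H)
        samples.append(show_dot_and_capture(x, y))
        if not attention_check(tap_side):
            # Drop samples recorded while the user wasn't paying attention.
            return []
    return samples

# A perfectly attentive "user" who always taps the side shown in the dot.
print(len(collect_session(5, tap_side=lambda label: label)))
```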

GazeCapture information was then used to train software called iTracker, which can also run on an iPhone. The handset’s camera captures your face, and the software considers factors like the position and direction of your head and eyes to figure out where your gaze is focused on the screen.
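In rough terms, that prediction step takes a camera frame, isolates the face and eye regions, notes where the head sits in the frame, and maps those cues to a point on the screen. Below is a minimal Python sketch of that flow; the hand-picked features and the untrained linear “model” are placeholders for illustration, not anything from the actual iTracker software.

```python
import numpy as np

def crop_regions(frame, face_box, left_eye_box, right_eye_box):
    """Cut the face and eye regions out of a full camera frame (H x W array)."""
    def crop(box):
        x0, y0, x1, y1 = box
        return frame[y0:y1, x0:x1]
    return crop(face_box), crop(left_eye_box), crop(right_eye_box)

def extract_features(face, left_eye, right_eye, face_box, frame_shape):
    """Summarize the crops plus the head's position and size within the frame."""
    h, w = frame_shape
    x0, y0, x1, y1 = face_box
    head = [(x0 + x1) / (2 * w), (y0 + y1) / (2 * h), (x1 - x0) / w]
    brightness = [region.mean() / 255.0 for region in (face, left_eye, right_eye)]
    return np.array(head + brightness)

def predict_gaze(features, weights, bias):
    """Map the feature vector to an (x, y) point on the screen."""
    return features @ weights + bias

# Dummy frame and untrained (random) parameters, just to show the data flow.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640))
face_box = (200, 100, 440, 340)
face, left_eye, right_eye = crop_regions(
    frame, face_box, (240, 160, 300, 200), (340, 160, 400, 200))
features = extract_features(face, left_eye, right_eye, face_box, frame.shape)
print(predict_gaze(features, rng.normal(size=(6, 2)), np.zeros(2)))
```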

About 1,500 people have used the GazeCapture app so far, Khosla says, and he thinks that if the researchers can get data from 10,000 people they’ll be able to reduce iTracker’s error rate to half a centimeter, which should be good enough for a range of eye-tracking applications.

Khosla is hoping it could be used for medical diagnoses in particular; some studies have considered how eye movements might be used to diagnose conditions including schizophrenia and concussions.

Andrew Duchowski, a professor at Clemson University who studies eye tracking, thinks iTracker could be “hugely” useful if the researchers can get it working well on mobile devices, though he cautions that it will also need to work quickly and not consume too much battery life.

He doesn’t think it’s possible to get pixel-level accuracy from it, but he says “it could still be pretty good.”
