Control Your Smartphone with Your Eyes

Researchers are making mobile software that could let you rely on eye movements to play games (or do other things).

In an effort to make eye tracking cheap, compact, and accurate enough to be included in smartphones, a group of researchers is crowdsourcing the collection of gaze information and using it to teach mobile software how to figure out where you’re looking in real time.

Researchers at MIT, the University of Georgia, and Germany’s Max Planck Institute for Informatics are working on the project, and they say that so far they’ve been able to train software to identify where a person is looking with an accuracy of about a centimeter on a mobile phone and 1.7 centimeters on a tablet.

That isn’t especially precise when you consider the overall size of a smartphone screen, and it’s still not exact enough for consumer applications, says Aditya Khosla, a graduate student at MIT and a coauthor of a paper on the work that was presented at a computer vision conference this week.

But he believes the system’s accuracy will improve with more data. If it does, it could make eye tracking far more widespread and useful; the technology has typically been expensive and dependent on dedicated hardware, which has made it tricky to build into gadgets like phones and tablets. It could also let you play games or navigate your smartphone without having to tap or swipe.

The researchers started out by building an iPhone app called GazeCapture that gathered data about how people look at their phones in different environments outside the confines of a lab. Users’ gaze was recorded with the phone’s front camera as they were shown pulsating dots on the screen. To make sure they were paying attention, they were then shown a dot with an “L” or “R” inside it, and they had to tap the left or right side of the screen in response.
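The “L”/“R” tap serves as an attention check: a gaze recording is only trustworthy if the user was actually looking at the dot when it appeared. As a loose illustration of that filtering idea (the data structures and function names here are hypothetical, not the actual app’s code), a collection pipeline might keep only the samples where the tap matched the label:

```python
# Toy sketch of GazeCapture-style attention-check filtering. The sample
# format (dot_label, tap_side, gaze_point) is an assumption made for
# illustration, not the app's real data layout.

def keep_sample(dot_label: str, tap_side: str) -> bool:
    """Keep a gaze sample only if the attention-check tap matches the dot."""
    return dot_label == tap_side

def filter_session(samples):
    """samples: list of (dot_label, tap_side, gaze_point) tuples."""
    return [gaze for label, tap, gaze in samples if keep_sample(label, tap)]

session = [
    ("L", "L", (120, 340)),  # tap matches dot: user was attentive, kept
    ("R", "L", (600, 200)),  # mismatched tap: likely distracted, discarded
    ("R", "R", (580, 210)),  # tap matches dot: kept
]
print(filter_session(session))  # [(120, 340), (580, 210)]
```

Discarding mismatched samples matters because crowdsourced data is collected outside the lab, with no experimenter checking that participants stay on task.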

GazeCapture information was then used to train software called iTracker, which can also run on an iPhone. The handset’s camera captures your face, and the software considers factors like the position and direction of your head and eyes to figure out where your gaze is focused on the screen.
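At its core, this is a regression problem: features describing the head and eyes go in, and an (x, y) point on the screen comes out. iTracker itself is a neural network trained on camera images, so the toy nearest-neighbor sketch below is only a loose illustration of that input/output shape, with entirely made-up feature vectors and calibration data:

```python
import math

# Illustration only: iTracker is a trained neural network, not a
# nearest-neighbor lookup. This sketch just shows the shape of the task:
# map a pose feature vector to the screen point of the most similar
# previously labeled sample. All numbers here are invented.

def predict_gaze(features, calibration):
    """Return the screen point of the closest calibration sample.

    features:    tuple of numbers describing the current face/eye pose
    calibration: list of (feature_tuple, (x, y)) pairs from known dots
    """
    _, point = min(calibration,
                   key=lambda pair: math.dist(pair[0], features))
    return point

calibration = [
    ((0.1, 0.2), (100, 150)),  # pose seen while looking toward upper left
    ((0.8, 0.9), (650, 900)),  # pose seen while looking toward lower right
]
print(predict_gaze((0.15, 0.25), calibration))  # (100, 150)
```

The quality of any such mapping depends on how well the training data covers different faces, lighting conditions, and head positions, which is why the researchers are crowdsourcing collection rather than recording in a lab.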

About 1,500 people have used the GazeCapture app so far, Khosla says, and he thinks that if the researchers can get data from 10,000 people they’ll be able to reduce iTracker’s error rate to half a centimeter, which should be good enough for a range of eye-tracking applications.

Khosla is hoping it could be used for medical diagnoses in particular; some studies have considered how eye movements might be used to diagnose conditions including schizophrenia and concussions.

Andrew Duchowski, a professor at Clemson University who studies eye tracking, thinks iTracker could be “hugely” useful if the researchers can get it working well on mobile devices, though he cautions that it will also need to work quickly and not consume too much battery life.

He doesn’t think it’s possible to get pixel-level accuracy from it, but he says “it could still be pretty good.”

