Point taken: The EyeRing captures an image and sends it to a smartphone for processing.
Normally, we point at things to specify, or to emphasize, what we’re talking about. But a project from several MIT researchers aims to make pointing a way to learn more about the world around you—with a special ring on your index finger and a smartphone in your pocket.
Called EyeRing, the finger-worn device allows you to point at an object, take a photo, and hear feedback about what it is you just focused on. The project is the brainchild of Pattie Maes, a professor in MIT’s Media Lab who studies interfaces that let us interact with digital information in novel, intuitive ways. Initially conceived as a potential aid for the visually impaired, the EyeRing could also work as a navigation or translation aid, or help children learn to read, say the researchers involved. The group is interested in eventually turning it into a commercial product.
As smartphones become increasingly common, the use of augmented reality—the blending of digital content with the real world—has also risen, mainly in the form of apps that harness the phone’s camera and sensors and use its screen as a window to a more data-rich world (see “Augmented Reality Is Finally Getting Real”).
The EyeRing takes this a step further by offering aural feedback via a wearable device. And while it’s still just a research project, some experts believe wearable electronics will eventually become common—an idea Google recently put in the spotlight by confirming it’s working on glasses that can show the wearer maps, messages, and more (see “You Will Want Google Goggles”).
The EyeRing, which is currently 3-D printed in plastic, includes a tiny camera, a processor, and Bluetooth connectivity. To use it, you double-click a small button on its side and speak a command to set the ring's function (it can currently be set to identify currency, text, prices on price tags, and colors). Point at whatever you'd like more information about—a shirt on a store rack, for instance—and click the button to snap a photo. The picture is sent via Bluetooth to your smartphone, where an app uses computer-vision algorithms to process the image and then announce out loud what it sees ("green," for example, denoting the color of the shirt). The results are also shown on the smartphone's screen.
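The article doesn't describe the app's actual computer-vision algorithms, but the color mode described above can be sketched in a few lines: average the image's pixels and report the nearest entry in a table of named colors. Everything here (the palette, the function names) is a hypothetical illustration, not the EyeRing code.

```python
# Hypothetical sketch of a phone-side "color" mode like EyeRing's: map an
# image's average RGB value to the nearest name in a small color table.
from statistics import mean

# Small illustrative palette (assumption; a real app would use a richer one).
NAMED_COLORS = {
    "red":   (255, 0, 0),
    "green": (0, 128, 0),
    "blue":  (0, 0, 255),
    "white": (255, 255, 255),
    "black": (0, 0, 0),
}

def average_rgb(pixels):
    """Average each RGB channel over a list of (r, g, b) pixels."""
    return tuple(mean(channel) for channel in zip(*pixels))

def name_color(pixels):
    """Return the palette name nearest (squared Euclidean distance) to the average color."""
    avg = average_rgb(pixels)
    return min(
        NAMED_COLORS,
        key=lambda name: sum((c - a) ** 2 for c, a in zip(NAMED_COLORS[name], avg)),
    )

# A mostly-green patch, like the shirt in the example above:
print(name_color([(20, 130, 25), (10, 120, 30), (15, 125, 20)]))  # -> green
```

The spoken feedback would then come from handing the returned name to the phone's text-to-speech engine.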
“Not having to get your phone out of your pocket or purse and open it is a big advantage, we think,” Maes says.
So far, the researchers have gotten EyeRing working with a smartphone running Google’s Android software and with a Mac computer, says Roy Shilkrot, a graduate student in the Fluid Interfaces Group within MIT’s Media Lab who is working on the device with Maes. An iPhone app is also in the works. The group has performed tests of the EyeRing with visually impaired people.
Aapo Markkanen, an analyst with ABI Research, thinks finger-worn devices like the EyeRing could be useful, but he notes that any wearable device will face some of the same issues that have hampered smartphones: limited processing power and battery life. And wearable technology faces the additional hurdle of needing to be comfortable enough for people to want to use it for extended periods of time. Markkanen expects it will be several years before this is the case.
Maes agrees that processing power and battery life are concerns, but thinks that in a few years, turning EyeRing into a commercial device will be “very doable.”
Shilkrot believes it could eventually be sold for under $100—perhaps as cheaply as $50. Still, he says, it would take several more iterations of the project before it could be useful to people. “We want to keep working on this and make it better,” he says. “Right now, we’re in the stage where we’re trying to prove it’s a viable solution.”