For decades, engineers have envisioned wearable displays for pilots, surgeons, and mechanics. But so far, a compact wearable display that’s easy to interact with has proved elusive.
Researchers at the Fraunhofer Institute for Photonic Microsystems (IPMS) have now developed a screen technology that could help make wearable displays more compact and simpler to use. By interlacing photodetector cells, similar to those used to capture light in a camera, with display pixels, the researchers have built a system that can show a moving image while also detecting movement directly in front of it. Tracking a person's eye movements while she looks at the screen could allow for eye-tracking control: instead of using hand controls or another form of input, a user could flip through menu options simply by looking at the relevant part of the screen. The researchers envisage eventually integrating the screen with an augmented-reality system.
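To illustrate the idea (this is not the Fraunhofer implementation, just a minimal sketch under simplifying assumptions), eye-tracking control from a coarse photodetector grid can be thought of as finding the dark pupil in the sensor readings and mapping its position to a region of the screen:

```python
def gaze_quadrant(readings):
    """Hypothetical gaze estimator for a coarse photodetector grid.

    readings: 2D list of light-intensity values (rows x cols). The pupil
    reflects less light than the surrounding eye, so we weight each cell
    by its darkness, take the weighted centroid, and map it to a screen
    quadrant that could drive a menu selection.
    Returns 'top-left', 'top-right', 'bottom-left', 'bottom-right',
    or None if no dark region is detected.
    """
    rows, cols = len(readings), len(readings[0])
    # Invert intensities so the dark pupil carries the most weight.
    max_val = max(max(row) for row in readings)
    weights = [[max_val - v for v in row] for row in readings]
    total = sum(sum(row) for row in weights)
    if total == 0:  # uniform illumination: nothing to track
        return None
    # Darkness-weighted centroid of the grid.
    cy = sum(y * w for y, row in enumerate(weights) for w in row) / total
    cx = sum(x * w for row in weights for x, w in enumerate(row)) / total
    vert = "top" if cy < (rows - 1) / 2 else "bottom"
    horiz = "left" if cx < (cols - 1) / 2 else "right"
    return f"{vert}-{horiz}"
```

A real system would need calibration and far more robust pupil detection, but even a 12-pixel sensor like the current prototype's could in principle support this kind of coarse region selection.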
“We can present an image and, at the same time, track the movement of the user’s eye,” says Michael Scholles, business unit manager at Fraunhofer IPMS. “This is of great interest for all kinds of applications where your hands are needed for something else, like a pilot flying an aircraft or a surgeon wanting to access vital parameters while performing surgery.”
Eye-tracking technology is nothing new, of course. Over the years, researchers have developed a number of systems that follow a person’s gaze to allow him or her to interface with a computer. Often, the applications are for physically impaired people, but they can also be designed for a general computer user.
Additionally, researchers have been developing wearable display systems for years, but for the most part, these have been clunky, power-hungry, and not entirely practical to use, says Alexander Sawchuk, a professor of electrical engineering at the University of Southern California. “Anything that can be done to make [wearable displays] more compact or lighter weight and low power is important,” he says. And integrating a display and a camera on one chip is a step toward this, he says.
The researchers built the system by first designing a light-sensing chip featuring a pattern of evenly spaced photodetectors, which was fabricated at a commercial semiconductor manufacturing facility. A wafer containing multiple chips was then placed in a deposition chamber, where layers of organic material were deposited in between the photodetectors. These layers form the organic light-emitting diodes, or OLEDs, that create the display. Finally, the mosaic of photodetectors and OLEDs was encapsulated in a thin polymer film to protect it.
The idea of integrating OLEDs with a photodetector chip is intriguing, says Sawchuk. “There are a lot of challenges in building wearable displays for the applications [intended by the researchers], and any advances in this field are very exciting,” he says.
The Fraunhofer IPMS researchers will demonstrate their prototype at the Society for Information Display conference in San Antonio this week. The current version features a simple monochrome display, about 1.25 centimeters on each side, with a resolution of 320 by 240 pixels. Scholles says that full-color displays are possible but trickier to create because they require adding color filters to white OLEDs, which are difficult to make efficiently and aren’t always reliable. However, the team at Fraunhofer IPMS has partnered with Novaled, an OLED company that manufactures high-quality white diodes, and plans to make future color prototypes using the company’s diodes.
The camera in the researchers’ current prototype is still fairly rudimentary. It has a resolution of only 12 pixels, which means that it can’t yet track a user’s eye movements. However, Scholles says that the team has developed a 160-by-120-pixel version of the camera chip that has been tested in the lab, but not yet integrated with a display. The researchers expect to have an advanced version of the system, complete with a higher-resolution camera and full eye-tracking capability, by early 2011.