
GM Develops Augmented Reality Windshield

The display outlines the road, and pinpoints obstacles, people and signs, even in bad weather.

A new “enhanced vision system” from General Motors could help drivers by highlighting landmarks, obstacles, and road edges on the windshield in real time. Such a system could point out potential hazards to drivers, such as a running animal, even in foggy or dark conditions, GM says.

Head-up displays (HUDs) are already used to project some information, such as a car’s speed or directions, directly in front of the driver, through the windshield or even through a side-view mirror. These displays have started appearing in high-end cars and typically work by projecting light to create an image on part of the windshield.

To turn the entire windshield into a transparent display, GM uses a special type of glass coated with red- and blue-emitting phosphors: clear synthetic materials that glow when excited by ultraviolet light. The phosphor display, created by SuperImaging, is activated by tiny ultraviolet lasers that bounce off mirrors bundled near the windshield. Three cameras track a driver’s head and eyes to determine where she is looking.
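GM has not published the details of its rendering pipeline, but the geometry such a system has to solve can be sketched simply: given an obstacle’s position from the car’s sensors and the driver’s eye position from the tracking cameras, find the spot on the glass where a drawn mark lines up with the obstacle from the driver’s point of view. The short Python sketch below is purely illustrative; the function names, coordinates, and the flat-windshield simplification are assumptions, not GM’s implementation.

    # Illustrative sketch only -- not GM's algorithm. It finds where on the glass
    # to draw a highlight so that, from the driver's tracked eye position, the
    # mark appears to overlay an object outside the car. The windshield is
    # approximated as a flat plane; all names and numbers are hypothetical.

    import numpy as np

    def highlight_point_on_windshield(eye_pos, obstacle_pos, plane_point, plane_normal):
        """Intersect the eye-to-obstacle ray with the windshield plane.

        eye_pos, obstacle_pos: 3D points in vehicle coordinates (meters).
        plane_point, plane_normal: a point on the windshield and its normal.
        Returns the 3D point on the glass where the highlight should be drawn,
        or None if the ray runs parallel to the windshield.
        """
        eye = np.asarray(eye_pos, dtype=float)
        obstacle = np.asarray(obstacle_pos, dtype=float)
        n = np.asarray(plane_normal, dtype=float)
        p0 = np.asarray(plane_point, dtype=float)

        direction = obstacle - eye              # ray from the eye toward the obstacle
        denom = np.dot(n, direction)
        if abs(denom) < 1e-9:                   # ray parallel to the glass: no intersection
            return None
        t = np.dot(n, p0 - eye) / denom
        return eye + t * direction              # intersection point on the windshield

    # Example: eyes tracked about 1.2 m up and 2 m behind the windshield plane,
    # and a pedestrian detected 30 m ahead and 2 m to the left of the car.
    eye = [0.0, -2.0, 1.2]
    pedestrian = [-2.0, 30.0, 0.9]
    mark = highlight_point_on_windshield(eye, pedestrian,
                                         plane_point=[0.0, 0.0, 1.0],
                                         plane_normal=[0.0, 1.0, 0.0])
    print(mark)  # where the UV laser would need to excite the phosphor coating

Because the mark’s correct position depends on where the driver’s eyes are, the head- and eye-tracking cameras are essential: as the driver moves, the drawn outline has to shift on the glass to stay registered with the world outside.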

“We definitely don’t want the virtual image that’s on the display to compete with the external world; we just want to augment it,” says Thomas Seder, the lab group manager for the Human Machine Interface group at GM.

The new display, which so far has only been tested in simulations, wouldn’t be incorporated into cars until 2018 at the earliest, says Seder. The team hopes to pair the technology with night vision and find a way to combine the work with other sensors in the car to keep costs down, he adds.

“I’d like to couple with other systems and not have it be a standalone. That will help cost reduce it dramatically,” says Seder.

See how the system, which was developed with partners from Carnegie Mellon University and the University of Southern California, works in the video below.
