
GM Develops Augmented Reality Windshield

The display outlines the road, and pinpoints obstacles, people and signs, even in bad weather.

A new “enhanced vision system” from General Motors could help drivers by highlighting landmarks, obstacles, and road edges on the windshield in real time. GM says the system can point out potential hazards, such as a running animal, even in foggy or dark conditions.

Head-up displays (HUDs) are already used to project some information, such as a car’s speed or directions, directly in front of the driver through the windshield, or even via a side-view mirror. These displays have started appearing in high-end cars and typically work by projecting light to create an image on part of the windshield.

To turn the entire windshield into a transparent display, GM uses a special type of glass coated with red-emitting and blue-emitting phosphors, clear synthetic materials that glow when excited by ultraviolet light. The phosphor display, created by SuperImaging, is activated by tiny ultraviolet lasers that bounce off mirrors bundled near the windshield. Three cameras track the driver’s head and eyes to determine where she is looking.
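The head- and eye-tracking cameras matter because a windshield overlay only works if it lines up with the outside world from the driver’s point of view. As a rough illustration of that registration step, here is a minimal Python sketch that assumes a hypothetical flat-windshield plane and made-up coordinates (not GM’s actual design or parameters): it intersects the line from the driver’s eye to a detected hazard with the windshield plane to find where the highlight should be drawn.

```python
import numpy as np

# Hypothetical geometry (illustrative only): the windshield is approximated
# as a flat plane in the car's coordinate frame, defined by a point on the
# glass and its surface normal. All positions are in meters.
WINDSHIELD_POINT = np.array([1.2, 0.0, 1.1])      # a point on the glass
WINDSHIELD_NORMAL = np.array([-0.94, 0.0, 0.34])  # roughly facing the driver


def overlay_point(eye_pos, hazard_pos):
    """Return where on the windshield plane to draw a highlight so that,
    seen from the driver's eye position, it lines up with the hazard.

    The draw point is the intersection of the eye-to-hazard ray with the
    windshield plane; returns None if no sensible intersection exists.
    """
    ray = hazard_pos - eye_pos
    denom = np.dot(WINDSHIELD_NORMAL, ray)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the glass; nothing to draw
    t = np.dot(WINDSHIELD_NORMAL, WINDSHIELD_POINT - eye_pos) / denom
    if t <= 0:
        return None  # intersection is behind the driver
    return eye_pos + t * ray


# Example: an eye position as it might come from head/eye-tracking cameras,
# and a hazard (say, an animal near the road edge) located by forward sensors.
eye = np.array([0.0, -0.35, 1.15])
animal = np.array([25.0, 2.0, 0.4])
print(overlay_point(eye, animal))
```

Because the draw point depends on the eye position, the overlay shifts as the driver’s head moves, which is why the tracking cameras are part of the system rather than an optional extra.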

“We definitely don’t want the virtual image that’s on the display to compete with the external world; we just want to augment it,” says Thomas Seder, the lab group manager for the Human Machine Interface group at GM.

The new display, which so far has only been tested in simulations, wouldn’t be incorporated into cars until 2018 at the earliest, says Seder. The team hopes to pair the technology with night vision and find a way to combine the work with other sensors in the car to keep costs down, he adds.

“I’d like to couple with other systems and not have it be a standalone. That will help cost reduce it dramatically,” says Seder.

The system was developed with partners from Carnegie Mellon University and the University of Southern California.
