
Nvidia is using AI in chips to make video games even more realistic

August 20, 2018

Artificial intelligence is helping to crack the challenge of creating lifelike light and shadow effects in games.

The news: Nvidia has just taken the wraps off new graphics chips that offer a significant improvement in “ray tracing,” the process of tracing the paths of light rays through a scene and accurately rendering what happens when they encounter virtual objects. (For an example, see the reflections on the plane in the game Battlefield V.) Generating lifelike reflections, refractions, and other effects requires considerable computing power, which has held ray tracing back in consumer gaming.
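
Stripped of the hardware, the core idea is easy to show in code. Below is a minimal sketch in Python of the classic ray-tracing loop: fire a ray through each pixel, test it against the scene, and shade whatever it hits. The scene (a single sphere and one light) and all of its values are invented for illustration and have nothing to do with Nvidia's actual implementation.

# Minimal ray-tracing sketch: for each pixel, fire a ray into the scene,
# test it against one sphere, and shade the hit point by how directly it
# faces the light. All scene values are illustrative assumptions.
import math

WIDTH, HEIGHT = 40, 20               # tiny "image," printed as ASCII art
SPHERE_CENTER = (0.0, 0.0, 3.0)      # sphere 3 units in front of the camera
SPHERE_RADIUS = 1.0
LIGHT_DIR = (-0.577, 0.577, -0.577)  # unit vector pointing toward the light

def hit_sphere(direction):
    # Solve |t*d - c|^2 = r^2 for the nearest positive t (camera at origin).
    oc = [-c for c in SPHERE_CENTER]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(v * v for v in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c           # direction is unit length, so a == 1
    if disc < 0:
        return None                  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

rows = []
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # Map the pixel to a unit-length ray direction from the camera.
        x = (i / WIDTH - 0.5) * 2.0
        y = (0.5 - j / HEIGHT) * 2.0
        length = math.sqrt(x * x + y * y + 1.0)
        d = (x / length, y / length, 1.0 / length)
        t = hit_sphere(d)
        if t is None:
            row += " "               # background: nothing hit
        else:
            # Shade by the angle between the surface normal and the light.
            p = [t * d[k] for k in range(3)]
            n = [(p[k] - SPHERE_CENTER[k]) / SPHERE_RADIUS for k in range(3)]
            brightness = max(0.0, sum(n[k] * LIGHT_DIR[k] for k in range(3)))
            row += ".:-=+*#%@"[min(8, int(brightness * 9))]
    rows.append(row)
print("\n".join(rows))

Real renderers repeat this per-pixel work with many rays per pixel and let rays bounce to capture reflections, refractions, and soft shadows, which is where the computational cost explodes.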

AI speedup: The existing approach to tackling light and shadows involves building an image from back to front, overlaying multiple elements to achieve the finished picture. Nvidia’s new chips, which are based on its recently announced Turing architecture, speed things up by rendering part of an image using this existing approach and then tapping artificial intelligence to predict and fill in the remaining light effects.
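
In outline, that hybrid pipeline looks something like the Python sketch below. The toy scene and sample counts are invented, and the box-filter "denoiser" is a crude stand-in for the trained neural network Nvidia describes; only the shape of the pipeline, a sparse and noisy render followed by a learned fill-in step, matches the approach.

# Sketch of the hybrid idea: trace only a few rays per pixel, producing a
# noisy image, then let a "denoiser" fill in the missing light information.
# A real pipeline uses a trained neural network; a blur stands in for it here.
import numpy as np

rng = np.random.default_rng(0)
H, W = 64, 64
SAMPLES_PER_PIXEL = 4            # deliberately far too few for a clean image

# Ground-truth radiance for a toy scene: a smooth diagonal gradient.
y, x = np.mgrid[0:H, 0:W]
true_radiance = (x + y) / (H + W - 2)

# Monte Carlo estimate: each pixel averages a handful of noisy samples,
# mimicking a sparse ray-traced render.
samples = true_radiance[..., None] + rng.normal(0.0, 0.3, (H, W, SAMPLES_PER_PIXEL))
noisy = samples.mean(axis=-1)

def denoise(img, radius=2):
    # Stand-in for the neural denoiser: a simple box filter that averages
    # each pixel with its neighbors to reconstruct the smooth signal.
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy : radius + dy + img.shape[0],
                          radius + dx : radius + dx + img.shape[1]]
    return out / (2 * radius + 1) ** 2

restored = denoise(noisy)
for name, img in [("noisy", noisy), ("denoised", restored)]:
    err = np.abs(img - true_radiance).mean()
    print(f"{name:>8}: mean abs error vs. ground truth = {err:.3f}")

Running this prints a markedly lower error for the denoised image, which is the whole trick: tracing a fraction of the rays and reconstructing the rest is far cheaper than tracing them all.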

Crypto slowdown: This real-time ray tracing capability should help juice sales of Nvidia’s graphics chips for gaming, and that could be excellent timing for the company. Turbulence in cryptocurrency markets has hit demand for its GPU chips, which until recently were being snapped up in large numbers by miners of Bitcoin and other virtual currencies. 
