
Intel Puts Game Physics in the Cloud

Simulating the physics of light makes for games that better mimic the real world.

Carl Sagan once said that to make an apple pie from scratch you must first create the universe. The same principle applies when it comes to making computer graphics truly look like the real world: you need to start with the basic physics of how light travels through air and objects, bounces off surfaces, and refracts.

Doing that, using a technique called ray tracing, makes for the ultimate in realistic computer graphics and gaming. Unfortunately, it is so computationally intensive to do in real time that examples have so far been limited to research labs.
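
To get a sense of why real time is the hard part, here is a minimal, illustrative sketch in Python (not anything Intel showed): a ray tracer fires at least one ray per pixel every frame and tests it against the scene's geometry.

```python
# Minimal sketch of the core ray-tracing operation: build a camera ray for a
# pixel and intersect it with a sphere. At 1280x720 and 60 frames per second
# that is roughly 55 million primary rays per second, before any reflections
# or shadows are traced, which is why real-time ray tracing is so demanding.

import numpy as np

WIDTH, HEIGHT = 1280, 720

def primary_ray(x, y):
    """Build a pinhole-camera ray through pixel (x, y)."""
    u = (x + 0.5) / WIDTH * 2.0 - 1.0
    v = 1.0 - (y + 0.5) / HEIGHT * 2.0
    direction = np.array([u, v, -1.0])
    return np.zeros(3), direction / np.linalg.norm(direction)

def hit_sphere(origin, direction, center, radius):
    """Classic ray-sphere test: solve |origin + t*direction - center|^2 = r^2."""
    oc = origin - center
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None          # the ray misses the sphere
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None
```

Every visible pixel needs at least one such test against every object in view, sixty times a second, which quickly outruns an ordinary laptop.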

That may be about to change. At the Intel Developer Forum in San Francisco last night, Intel researchers based in Germany showed me a nondescript laptop running a ray-traced version of the first-person shooter Wolfenstein. By shunting the necessary physics calculations into the cloud, onto servers reached over a network, they have made it possible for even puny machines to offer truly realistic graphics. When the user interacts with the game, their commands are sent to the cloud, which calculates how its simulated universe has to change and sends the resulting frames back.
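
The division of labour is essentially that of a thin client: only the player's inputs travel up, and rendered frames come back down. The sketch below shows that loop in outline; the server address, message framing, and helper functions are invented for illustration and are not Intel's (or OnLive's) actual protocol.

```python
# Illustrative thin-client loop for cloud-rendered gaming. All names here
# (render-farm.example.com, read_local_input, display) are hypothetical.

import socket
import struct

SERVER = ("render-farm.example.com", 9000)  # hypothetical cloud render server

def read_local_input():
    """Placeholder: gather whatever input the platform reports this tick."""
    return b"FORWARD"

def display(frame_bytes):
    """Placeholder: hand the received frame to the local window/GPU."""
    pass

def recv_exact(conn, n):
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed the stream")
        buf += chunk
    return buf

def game_loop():
    with socket.create_connection(SERVER) as conn:
        while True:
            # 1. Send the player's latest commands (a tiny payload).
            commands = read_local_input()
            conn.sendall(struct.pack("!I", len(commands)) + commands)

            # 2. The server updates its simulated world, ray traces the next
            #    frame, compresses it, and streams it back.
            size = struct.unpack("!I", recv_exact(conn, 4))[0]
            frame = recv_exact(conn, size)

            # 3. Show it. The laptop never touches the scene geometry.
            display(frame)
```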

The screenshot above gives an idea of how a ray-traced scene, in particular the play of light and shadow, looks more real. It is more striking still when you stroll around in the game and the shadows shift accordingly. A nice example came when the game character crouched next to a classic car polished to a high shine: the movement of other characters was reflected and distorted in the paintwork as it would be in real life, complete with reflections of reflections where two surfaces met. This could make for much more complex gaming; sneaking up on other players would be much harder.
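
Those reflections of reflections are what recursive ray tracing produces naturally: when a ray hits a shiny surface, the tracer spawns a new ray in the mirrored direction and follows it, up to some bounce limit. A hedged sketch, assuming a hypothetical scene API:

```python
# Illustrative recursive reflection step. The `scene` object and its
# nearest_intersection() method are assumed for this sketch, not a real API.

import numpy as np

MAX_BOUNCES = 3  # each extra bounce multiplies the per-pixel work

def reflect(direction, normal):
    """Mirror a ray direction about the surface normal."""
    return direction - 2.0 * np.dot(direction, normal) * normal

def trace(origin, direction, scene, depth=0):
    hit = scene.nearest_intersection(origin, direction)
    if hit is None:
        return scene.background_color
    color = hit.surface_color
    if depth < MAX_BOUNCES and hit.reflectivity > 0.0:
        # The car paintwork reflecting other characters, and their
        # reflections in turn, comes from this recursive call.
        bounced = reflect(direction, hit.normal)
        reflected = trace(hit.point, bounced, scene, depth + 1)
        color = (1 - hit.reflectivity) * color + hit.reflectivity * reflected
    return color
```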

Startups like OnLive have already shown that running a game in the cloud is possible. With powerful enough servers it should be possible to do the same with a ray-traced game, Daniel Pohl of Intel told me, allowing even gamers without the most powerful hardware to experience truly realistic graphics. You can read and see more at Pohl's webpage on the project, and in this blog post.
