
It’s October 1942. My company of British infantry has driven the Germans out of a small desert town – a choke point in the minefields laid down by Rommel’s Afrika Korps south of El Alamein – and is now fending off a counterattack. I’m on a rooftop sighting German tanks through my binoculars and shouting coordinates to the gunner at the 88-millimeter flak cannon we just captured. After the gunfights with dozens of Nazi soldiers it took to get here, it’s satisfying to watch from a safe distance as the stricken tanks burst into flame.

No, I’m not an actor on the set of a World War II film – but I might as well be. I’m playing Call of Duty 2 on Microsoft’s high-powered Xbox 360 gaming console, and I’m in a state of immersion – not just on a sensory level but, surprisingly, on an emotional one, too. It’s almost as if I were at the movies.

That verisimilitude is what’s most notable about the newest generation of video games. For the better part of a century, the most effective way to envelop an audience and surprise, amuse, sadden, or horrify it has been to make a movie. Everyone who saw Steven Spielberg’s 1998 film Saving Private Ryan, for example, recalls the first 20 minutes – an unbearably vivid re-creation of the American landing at Omaha Beach on D-Day, filmed largely on handheld cameras to heighten the theatergoer’s sense of being present amid the gore and violence. Now imagine that you, not Spielberg, are in charge of the action – deciding where to run and whom to shoot at. That is what it’s like to play Call of Duty 2 – and that’s how close today’s games have come to true interactive cinema.

In fact, I think it’s time to scrap the term “video game,” which will forever reek of teen-filled arcades and Super Mario Bros., for a coinage more suggestive of the complex character-driven narratives, freely navigable environments, and very nearly photorealistic graphics that now define state-of-the-art titles. I suggest “cinegame.” The word acknowledges the grown-up appeal of games like Call of Duty 2 – and let’s face it, 62 percent of America’s roughly 147 million gamers are adults – as well as the fact that the impressive processing power of machines like the Xbox 360 is rapidly pushing these games across the technological boundary between cartoonishness and filmlike veracity.

In days past, no one would have thought of comparing video games to movies. Indeed, the first several generations of games made no attempt at realism. I’m old enough to remember playing Atari’s Pong at a friend’s house when I was in the third or fourth grade. The beauty of Pong was in its mathematical purity: table tennis reduced to its abstract essence. Of course, that was about all the electronics of the day could do. But as microprocessors have grown in power, game designers have gradually abandoned abstraction in favor of concrete, textured, three-dimensional virtual worlds that can serve as settings for true storytelling. And with the Xbox 360, they’ve reached an apotheosis.
