It’s October 1942. My company of British infantry has driven the Germans out of a small desert town – a choke point in the minefields laid down by Rommel’s Afrika Korps south of El Alamein – and is now fending off a counterattack. I’m on a rooftop sighting German tanks through my binoculars and shouting coordinates to the gunner at the 88-millimeter flak cannon we just captured. After the gunfights with dozens of Nazi soldiers it took to get here, it’s satisfying to watch from a safe distance as the stricken tanks burst into flame.
No, I’m not an actor on the set of a World War II film – but I might as well be. I’m playing Call of Duty 2 on Microsoft’s high-powered Xbox 360 gaming console, and I’m in a state of immersion – not just on a sensory level but, surprisingly, on an emotional one, too. It’s almost as if I were at the movies.
[For sample screen shots from Call of Duty 2 and another game for the Xbox 360, click here.]
That verisimilitude is what’s most notable about the newest generation of video games. For the better part of a century, the most effective way to envelop an audience and surprise, amuse, sadden, or horrify it has been to make a movie. Everyone who saw Steven Spielberg’s 1998 film Saving Private Ryan, for example, recalls the first 20 minutes – an unbearably vivid re-creation of the American landing at Omaha Beach on D-Day, filmed largely on handheld cameras to heighten the theatergoer’s sense of being present amid the gore and violence. Now imagine that you, not Spielberg, are in charge of the action – deciding where to run and whom to shoot at. That is what it’s like to play Call of Duty 2 – and that’s how close today’s games have come to true interactive cinema.
In fact, I think it’s time to scrap the term “video game,” which will forever reek of teen-filled arcades and Super Mario Bros., for a coinage more suggestive of the complex character-driven narratives, freely navigable environments, and very nearly photorealistic graphics that now define state-of-the-art titles. I suggest “cinegame.” The word acknowledges the grown-up appeal of games like Call of Duty 2 – and let’s face it, 62 percent of America’s roughly 147 million gamers are adults – as well as the fact that the impressive processing power of machines like the Xbox 360 is rapidly pushing these games across the technological boundary between cartoonishness and filmlike veracity.
In days past, no one would have thought of comparing video games to movies. Indeed, the first several generations of games made no attempt at realism. I’m old enough to remember playing Atari’s Pong at a friend’s house when I was in the third or fourth grade. The beauty of Pong was in its mathematical purity: table tennis reduced to its abstract essence. Of course, that was about all the electronics of the day could do. But as microprocessors have grown in power, game designers have gradually abandoned abstraction in favor of concrete, textured, three-dimensional virtual worlds that can serve as settings for true storytelling. And with the Xbox 360, they’ve reached an apotheosis.
The machine, which Microsoft launched last November as the successor to the five-year-old Xbox, looks like a typical beige-box PC on the outside. But inside is a CPU with three separate processing cores, each running at 3.2 gigahertz (billions of clock cycles per second), compared with the single two- or three-gigahertz core inside the typical PC. That’s enough to generate 1,080 lines of resolution, meaning graphics look stunning even on high-definition TVs. All that power makes the Xbox 360 the current king of the video consoles – at least until Sony releases the PlayStation 3 later this year. (The PS3 will feature a new Sony-IBM-Toshiba chip with nine cores, also running at 3.2 gigahertz.)
Games for the Xbox 360 are not harder to complete than their predecessors, nor do they require better hand-eye coordination. Indeed, at high speed, Pong is fiendishly difficult. But Xbox 360 games give the player more to look at, think about, and feel.
In the case of Call of Duty 2, there’s the blood, smoke, and bullets, which strike with an impact you can feel through the Xbox’s vibrating controller. There are the moments of pure cinema: a soldier whose gaze follows the bombers flying overhead, a multistory factory that collapses into rubble in a cloud of dust and flame. There is an obsessive level of detail, such as the inlaid wood carvings on an upended desk in a pulverized building. But most of all, there’s the continual peril of combat as you and your fellow squad members try to kill Germans before they kill you. As you guide your character through the game’s immense 3-D environments, some impressive artificial-intelligence algorithms make your fellow soldiers follow (and sometimes lead), providing covering fire and shouted warnings about snipers and grenades. If you’re stupid enough to approach the Germans at close range, you’re on your own. But by watching your brothers-in-arms, you can eventually learn how to outmaneuver the enemy – or simply stay hidden.
In fact, though I’ve watched plenty of World War II movies, I don’t think I fully appreciated before playing this game that the most important thing in a soldier’s life is finding cover. Nor did I sufficiently understand the pandemonium and waste that marked the Allied campaigns in Europe and Africa. It may sound trite, but it’s true: I have a better sense of this war for having played this video game.
As affecting as Call of Duty 2 may be, however, there is another game that shows off the Xbox 360’s capabilities even more grandly. It’s Project Gotham Racing 3, a Grand Prix-style automobile racing game set on the roads of London, Las Vegas, New York, Tokyo, and Germany’s famous Nürburgring. In videoland, objects are constructed from tiny polygons; the more polygons, the smoother and less jagged an object will appear. The designers of PGR3 used up to 105,000 polygons per race car, more than 10 times the number used in Project Gotham Racing 2 for the original Xbox. Add in layer upon layer of effects such as reflections, shadows, dust, and motion blur, and the result is flabbergasting. Replays and still images from PGR3 races are nearly indistinguishable from the real thing (see www.technologyreview.com/xbox360).
I do not mean to argue that realism alone makes a game worth playing or that all games that try to be cinematic are masterpieces. In recent years, it’s become common for studios to pepper their games with movielike “cut scenes” in an attempt to wrap human-interest stories around the actual game missions. Rockstar Games, creator of the controversial Grand Theft Auto series, is a leader in this area. Unfortunately, the writing and voice acting in most cut scenes are schlocky. As video game critic Clive Thompson wrote for Slate in early 2005:
“These Hollywood flourishes are good for dazzling mainstream journalists and pundits. That’s because there’s still a weird anxiety about adults playing games. Most people still think that video games are sophomoric kid stuff; the ones that have a narrative and emulate the movies seem more serious and, well, mature. In fact, I think the truth is almost the opposite. The more video games become like movies, the worse they are as games.”
Thompson would be quite right – if, that is, cut scenes were the only way to give a game sweep and drama. But that’s no longer the case. With hardware as fast as Microsoft’s, designers can build drama into the missions themselves. Call of Duty 2, for example, has no cut scenes; a few old newsreels suffice to explain the setting for each campaign. Anything more would get in the way, making players into passive lookers-on in a game that’s all about lifelike experiences.
Of course, even if I’ve convinced you that the Xbox 360 is the best thing since the Lumière brothers patented the cinématographe in 1895, you may have trouble buying one. Manufacturing difficulties limited Microsoft’s production run to about 600,000 units between the machine’s November 22 launch and the end of the holiday season, according to market research firm NPD Group. That wasn’t nearly enough to satisfy the enormous demand for consumer electronics; by way of comparison, Apple sold 14 million iPods over the 2005 holidays. Xbox supplies were so low in January, when I was preparing to write this review, that Microsoft itself had run out: an apologetic person at the company’s public-relations firm explained to me that it might be several months before a loaner was available. So I resorted to eBay, where I found a man in Corvallis, OR, who was willing to sell his Xbox 360 core system (without accessories such as a hard drive and a second controller) for $499, a mere 60 percent markup over the retail price. Fortunately, production picked up after the holiday season was over, and Microsoft says it expects the shortage to ease by this summer.
Thirty-four years after Pong, video games are finally maturing from arcade-style tests of fine-motor skills into an independent art form. That lag time shouldn’t be surprising: it wasn’t until 1915, fully 20 years after the invention of motion pictures, that The Birth of a Nation set down the basic grammar of movie storytelling, and it was only in 1977, almost 30 years after the birth of network television, that Roots introduced the first art form truly unique to TV, the miniseries. Now that video games can credibly evoke emotion and borrow elements from movies and other media without slavishly imitating them, it’s time to welcome them into our museums, libraries, and living rooms.
Xbox 360 Core System
Call of Duty 2
Project Gotham Racing 3, Microsoft Game Studios, $49.99
Home page image by Tim Bower.
Wade Roush is a senior editor at Technology Review.