Early next year, Kuma Reality Games plans to launch a service that will allow players to re-enact contemporary news events. Kuma’s first product, centered on the war in Iraq, will brief players with information derived from real-world news reporting, and then allow them to play out missions based on actual troop deployments. On its Web site, Kuma claims that the new game “presents our soldiers’ acts of patriotism and bravery as never before possible.” And then, the kicker: “In a world being torn apart by international conflict, one thing is on everyone’s mind as they finish watching the nightly news: ‘Man, this would make a great game.’”
How dare we play-act the Iraq War when American G.I.s are still in harm’s way? Whatever views we may have about the war, surely we have more important things on our minds than whether or not it will make a kickass game!
That, at least, was my initial reaction. But upon further reflection, I realized that, despite their inflammatory tone, these guys are right. Ever since the September 11 attacks, there has been a persistent desire to transform this conflict into a game, and an almost equally persistent distaste for the idea. This debate tells us a great deal about the ways our culture thinks about games, and about the way we think about war.
Last year, a federal judge ruled that games did not enjoy First Amendment protection because they did not express ideas. This past summer, a higher court overruled that decision. The political importance of games has been demonstrated again and again as groups struggle over how, and whether, the Iraq War should be represented through games. The military uses games to recruit and train soldiers; the antiwar movement uses games to express the futility of the current conflict; the pro-war movement uses games to express its anger against the terrorists; the news media use games to explain military strategy; and the commercial games industry wants to test the waters to see if we are going to play war games the same way other generations watched war movies.
No sooner did the Bush administration identify Osama bin Laden as the likely culprit than a wave of amateur games popped up across the Internet, giving players the chance to maim and manhandle the terrorist leader. The Palestinian Liberation Organization created international controversy when it released a Web-based game, Under Ash, which it argued showed its perspective on the Middle Eastern conflict. The night the bombs fell on Baghdad, Sony trademarked “shock and awe” with the idea of using it as the title for a (since abandoned) Iraq War game. A few months later, the U.S. Department of Defense came under attack for developing a futures market where people could place bets on the likely location of the next terrorist attacks, a plan that was quickly scuttled once it was made public. The most recent controversy centers on a CIA proposal to develop a game, working with the University of Southern California-based Institute for Creative Technologies (ICT), which would allow operatives to “think outside the box” by adopting the role of a member of a terrorist cell.
In his 2001 essay, “Ephemeral Games: Is It Barbaric to Design Games After Auschwitz?” videogame designer and theorist Gonzalo Frasca argued that today’s games are an inappropriate medium for dealing with such serious matters. He cites two reasons: first, video games focus on winning and losing but not on the deeper ethical implications of modern warfare; second, games are infinitely reversible, which makes it impossible for them to sustain a tragic tone or to deal with the real-life consequences of such events. Ever since the essay appeared on Frasca’s Web site, many game designers (Frasca among them) have set out to prove him wrong, and in the process, to refine the language through which games express political ideas.
Frasca is not the only skeptic in this debate. Many feel that videogame depictions of war trivialize the real loss of life, though we celebrate films like Saving Private Ryan for immersing us in the experience of war. Do games like the World War II-themed Medal of Honor give us an equally powerful impression of what the battlefield is like? Perhaps this distaste merely reflects games’ recent invention and their low ranking on the cultural hierarchy.
If the idea of turning war into games is so intrinsically offensive, why has there been so little public outrage over the use of playing cards as a way of representing the search for and capture of Iraqi leaders? Is it right to deal with regime change the same way kids approach Pokemon: “gotta collect ‘em all”? The playing card interface suggests some of the deep historical links between war and games. Consider, for example, the ways that chess embodies the struggle between two warring kingdoms, or the use of martial and gladiatorial imagery when we talk about football. We have used games to represent struggles over space and power for thousands of years.
As such examples suggest, it is not playing war per se that offends most of us. Three key variables shape our gut reactions to this concept: the mode of representation (the relative abstraction of the chess board as compared to the graphic realism of most games); temporality (historic simulations vs. the rawness of current events); and motive (a military training exercise, an antiwar statement, or a commercial exploitation).
All of this explains the public outrage that arises each time such games are proposed, but it doesn’t explain why such efforts keep re-emerging. For one thing, we use games to work through the intense anxieties surrounding modern warfare, to bring it at least momentarily under our symbolic control. This view was widely shared among child psychologists in the World War II era, who encouraged kids to enact military conflicts and even sanctioned playing the role of the enemy as a way of feeling more control over their lives. At the same time, governments often encourage role-play as a means of building public support for their war efforts, and there, timeliness is key. Many of the classic war movies (such as Air Force, Action in the North Atlantic, and Bataan) were released during WWII through a cooperation between the federal government and the film industry, often depicting sanitized versions of events that had occurred overseas only months earlier.
This sense of war play as a recruitment tool inspires America’s Army, an online first-person shooter produced by the U.S. military last year and distributed free to game players around the world; there is even some talk that the game will come bundled on many new computers, suggesting that the Pentagon has been taking lessons from Bill Gates. One controversial aspect of the game has been its effort to reduce the explicit representation of violence in order to earn a rating low enough to get it into the hands of most teens, a choice that critics argue distorts the actual consequences of modern warfare. The Pentagon worked hard to ensure the game communicates military values, including mechanisms that reward players for honorable conduct and impose severe punishment if they shoot their own troops.
War play isn’t some creepy idea coming from the fringes of our culture. Games and simulations are increasingly woven into the strategies by which the U.S. government prepares us for armed conflict. The ICT emerged as a site of industry and government collaboration in the design and deployment of games in the service of national security following the 1991 Gulf War, which Gen. Norman Schwarzkopf had characterized as the first “Nintendo war.” War critics argue that modern warfare distances participants from human loss and makes it fun to blast away villages. The military command, on the other hand, has embraced computer games as the ideal means of preparing the next generation of soldiers to deal with the high-tech interfaces of modern fighting equipment. Kuma Games surely has the ICT in mind when it promises players access to “the same immersive, first- and third-person views used by the military in their own planning, training and analysis.”
One of my graduate students, Zhan Li, has done a thesis on the communities that have sprung up around America’s Army, even interviewing players as the first bombs were dropped on Baghdad. Most of the players said they went online to escape real-world news; some said they could no longer take pleasure in the game while real soldiers were dying. A few saw playing the game as a way of mourning the losses of comrades. Veterans and current GIs are often critical of the casual and, well, playful attitude with which nonmilitary people play the game. Li’s research suggests that America’s Army may be less effective as a propaganda tool than as a vehicle through which civilians and service folk can discuss the serious experience of real-life war.
The antiwar movement has found computer games to be an effective agitprop tool. Games may be to the Iraq War what underground comics were to Vietnam: a way to popularize countercultural messages by tapping into the popular culture. Frasca, in fact, is the primary architect for a game, entitled September 12, that challenges players to respond to a terrorist attack. You can target any of the buildings in an Arab village and blast them away with your warheads, but when you do, Muslim women weep over their dead children and more terrorists grab guns to defend their homes. Frasca built a game you can’t win; indeed, that is the message that it quickly communicates.
Another game, Blood of bin Laden, developed by artist Jason Huddy, plays upon the fact that in Afghanistan, land mines and food drops were wrapped in the same yellow paper. In this game, you have to move across a series of yellow squares, never certain whether they will blow you to bits or restore you to health. Huddy’s Web site explains that Blood of bin Laden “comes with an INSTANT gratification level that allows you to skip the war and head on into a room of defenseless Osamas.”
Or consider the case of Velvet-Strike, a hack developed by experimental artist Anne-Marie Schleiner, which involves spray-painting virtual antiwar graffiti on the computer-generated walls, ceilings, and floors of a networked counter-terrorism-themed game called Counter-Strike. The goal of Velvet-Strike is to protest the ways that war is trivialized in such spaces. Schleiner argues that while Counter-Strike promises graphical realism, it isn’t realistic enough, in that it does not depict refugee camps, bombed hospitals, or maimed children. Some Counter-Strike players have accused Schleiner of being a digital terrorist for trespassing upon their game play.
Each game reflects different understandings of this war and its moral consequences. And each explores the potential of digital games as a vehicle for shaping public opinion. Given the divisiveness of current sentiments toward the war and the newness of games as a rhetorical medium, it is hardly surprising that these games offend some and disappoint others. Can you really make a kickass game about what has been a less than kickass war?