Video games are now used in many college courses – from computer science to cultural studies.
The college students glued to video game consoles today are as likely to be scholars as slackers. More than 100 colleges and universities in North America – up from fewer than a dozen five years ago – now offer some form of “video game studies,” ranging from hard-core computer science that prepares students for game-making careers to critiques of games as cultural artifacts.
Recognition by the academy marks a coming of age for gaming. “When the School of Cinema-Television was founded 75 years ago, many people still considered film nothing but simplistic entertainment – a medium that could never be considered important artistically,” says the University of Southern California’s Scott Fisher, referring to USC’s famed film school. “Games are considered by many people today in the same way. But the next generation of game designers has the potential to change that.”
The interactive-media division at USC, which Fisher chairs, offers bachelor’s and master’s degrees in interactive media, with courses like game history and theoretical tools for creating games.
Randy Pausch, codirector of the Entertainment Technology Center at Carnegie Mellon University – which offers a master’s degree in entertainment technology – adds that gaming studies have a sneaky side: they attract students to computer science.
Meanwhile, on the lit-crit front, some scholars have coined a formal name for their discipline: ludology, from the Latin ludus (game). Topics range from game philology to the study of virtual economies in EverQuest.
Academic video-game departments are also cranking out workers for hundreds of video game studios. “The school system can turn out our worker bees,” says Jason Della Rocca, executive director of the International Game Developers Association.