It’s Doom’s Day.
Doom III, the third computer game in id Software’s legendary first-person shooter franchise, is finally on shelves. The game’s arrival – midnight release parties at computer stores around the country, preceded by a now-customary pirated leak online – echoes last decade’s Doom mania.
In 1993, the first Doom crashed computer networks from the University of Wisconsin to Intel after it was released as shareware online. The next year, Doom II shredded computer game sales charts and became the first video game in history to bear a voluntary rating for violence. As I detail in my book Masters of Doom (now in paperback!), id Software’s games pioneered or popularized innovations – fast-action first-person graphics, unabashed gore, multiplayer deathmatching, user-made modifications – that we take for granted today.
Will Doom III have such an impact? No way. But that’s not necessarily the fault of the game, which looks amazing and takes a fresh stab at story. The old revolutions were won a long time ago. The next revolution in gaming will deliver something that, like the original Doom, makes us look at this medium in a completely new way. Cinematic stories and ultra-realistic eye candy, while cool, are not a paradigm shift. Games aren’t supposed to be chasing movies. They’re supposed to be doing things movies can’t. Certain innovations – webcams, geocaching, the Sony EyeToy, the Nintendo DS, the upcoming Xbox game Fable – point toward what this future might resemble: a place where gaming takes on a completely different form, something that infiltrates our lives more deeply than even the greatest new graphics engine can render. And, like Doom, that place will be one to remember.