‘Web 2.0’ Will Die on October 1, 2012
Pundits have been predicting the death of the term “Web 2.0” since at least 2008, occasionally with a level of schadenfreude that renders their rants hilarious in retrospect. (TechCrunch heads for the deadpool, anyone?) In 2009 TechCrunch updated this meme with some actual data, but let’s face it, we weren’t far enough past the peak to make a prediction with any skill.

But no more! Today I’m putting a stake in the ground: “Web 2.0” will die on October 1, 2012. To arrive at this date, I simply fit a straight line to the more or less linear decline in search volume for the trend since its outlier peak in 2007. And by “fit” I mean “drew in Photoshop.”
You can see below that extending the trend line forward sees it cross the x-axis almost exactly three-quarters of the way through 2012. That’s the point at which the term “Web 2.0” will have become so tired and worn out that even the PR professionals who refuse to drop it will finally get a clue.
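For anyone who would rather not do their curve fitting in Photoshop, an ordinary least-squares line does the same job. A minimal sketch, using made-up yearly “search interest” values (not real Google Trends data) chosen to decline from a 2007 peak:

```python
# Fit a line to hypothetical yearly search-interest values
# (an index where 100 = the peak) and solve for the zero crossing.
# The data points below are illustrative, not real Google Trends figures.

years = [2007, 2008, 2009, 2010, 2011]
interest = [92, 76, 60, 44, 28]  # hypothetical yearly averages

n = len(years)
mean_x = sum(years) / n
mean_y = sum(interest) / n

# Ordinary least squares: slope and intercept
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(years, interest)) \
        / sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

# Year at which the fitted line hits zero
zero_crossing = -intercept / slope
print(round(zero_crossing, 2))  # → 2012.75
```

A result of 2012.75, i.e. three-quarters of the way through the year, lands on October 1, 2012. With different (real) data the crossing would move, which is rather the point: the date is only as good as the straight-line assumption.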

Of course, “Web 2.0” might not really die. Like other terms with historical significance, its search volume could asymptotically approach zero.
What this says about the over-use of “2.0” as a suffix for every other noun / trend you can think of is unknown. Hopefully a time when no one uses the antecedent of this lazy marketing shorthand is also a time when it doesn’t make sense to say, for example, “Food 2.0.”
The alternative, that “2.0” becomes a permanent part of speech like “all cooped up” or “behind the 8-ball,” is just too terrifying to contemplate.