Predicting the Post PC-Era 20 Years Ago
“When we were an agrarian nation, all cars were trucks. But as people moved more towards urban centers, people started to get into cars. I think PCs are going to be like trucks.”
A certain famous CEO said this in June of 2010. Since then, America’s love affair with the smartphone and tablet has grown unabated.
But by no means was Steve Jobs the first to “call” the death of the PC. An interesting post from tech commentator Robert X. Cringely reveals how he predicted the death of the personal computer back in 1992, in his book Accidental Empires. He called the death for right about… now.
His logic, based on history, was that transformative information technologies take 30 years to essentially be digested by society. It took three decades before moveable type led to books. It took three decades before telephones truly began to permeate and transform our lives. Similarly, film was born in the last years of the 19th century but only took off in the 1920s, and TV was invented in the 1920s but didn’t really take off until the 1950s.
Placing the invention of the personal computer somewhere in the mid-70s, Cringely suggested that PCs should have reached their transformative inflection point by now. “I was obviously a little off in my timing. But only a little off,” he writes. The PC and smartphone industries are neck and neck at $250 billion, but the latter is growing far faster, he points out. “How long before the PC as we knew it is dead?” he writes. “About five years I reckon…Nearly all of us are on our next-to-last PC.”
Do you agree? To a certain extent I never know how to judge such pronouncements of the post-PC era. Because I’m a writer, I need the processing power, keyboard, and screen real estate provided by my MacBook (and there are times when I even long for a desktop setup). And yet all around me I see people increasingly satisfied to do most or all of their work and communication via mobile devices like smartphones and tablets.
For all that, though, I’m still inclined to agree with Inc’s Renee Oricchio when she says that reports of the PC’s death have been greatly exaggerated. “More like a ‘multiple device’ era,” she writes (granted, that doesn’t have the same ring to it). Microsoft’s COO Kevin Turner sees this as a “PC+” era, reports The Verge. In this, he echoes a 1999 Bill Gates essay in Newsweek in which Gates explained “Why the PC Will Not Die.”
I’m sympathetic to the argument that mobile processing power may someday grow so powerful that I decide my iPhone 12 is the only computing device I need. I will carry it in my pocket, and around my house I will dock it in several different ways, depending on what room, or mode, or mood I’m in. At my desk, an external monitor and physical keyboard will serve as the inputs and outputs for my phone. In my living room, I’ll use it to beam content to my TV.
In what meaningful sense, though, will this not be a “personal computer”? It will be a computer that I carry literally on my person, and that I rely on in all the same ways I rely on a variety of computing devices now. As Gates declared over a decade ago, “the PC will morph into many new forms, such as book-size ‘tablet PCs.’ But they’ll still be PCs underneath, with all the benefits of the universal PC model.”
The only thing that’s certain in this business is that most predictions are bound to be wrong. Nonetheless, what do you think? Is the PC’s death genuinely imminent?