When did disruption become the overwhelming fact of business? It wasn’t always so. But the most admired businesses of the last 30 years have been technology companies, or industrial companies that invested heavily in research and development, whose competitive advantage was their capacity to commercialize disruptive innovations or resist the innovations of others.
The common perception that disruptive innovations are occurring more frequently is based on something real. From 1955 to 1993 the median turnover of the Fortune 500 was 29 companies per year, according to a 2012 Kauffman Foundation report. From 1995 to 2011 the turnover was 39 companies a year. (The report argues that increased disruption can’t be the only explanation: the higher turnover also reflects the changing methodology of Fortune’s list, which after 1994 included more volatile non-industrial firms, and also more mergers and acquisitions.) A turnover of 39 companies a year means that more than half the companies in the Fortune 500 are replaced every decade, if one includes companies coming on and off the list.
But the preoccupation with disruptive innovations is not only statistical. It is the product of our wonder at the rapid changes forced by technology companies over the last 30 years, and of our understanding of why other companies are unable to innovate. That understanding derives from the research of Clayton Christensen, a professor at Harvard Business School, whose first book, The Innovator’s Dilemma (1997), introduced the phrase “disruptive innovation.” Christensen meant the phrase to suggest an innovation that created a new market and disrupted an existing marketplace, and he provided a grim explanation of why Fortune 500 companies, no matter how efficiently managed, fell off the list: for impeccably rational reasons, their managers were busy satisfying the existing demands of customers instead of imagining how they might satisfy future needs. By definition, satisfying future needs with an entirely new product or service often meant destroying an existing business with its associated revenues, trained employees, production facilities, and supply chains. Creative destruction, Christensen taught, was always easier for a startup with nothing to lose than for an established firm.
We don’t believe that disruptive innovation occurs only in small companies (and, in fairness, Christensen described how established firms might become innovative in a 2003 sequel, The Innovator’s Solution). This annual issue of MIT Technology Review, dedicated to listing the 50 most disruptive companies in the world, celebrates two sorts of disruptive innovators. The first are the startups whose breakthroughs will overthrow the market dominance of larger companies: they include the thermostat maker Nest and Ambri, a maker of grid-scale batteries. The second sort of disruptive innovators are established firms willing to deconstruct their own businesses, because they recognize Christensen’s grim logic. These companies include Microsoft, whose transformative Windows 8 operating system is reviewed here, and Xerox, whose chief executive, Ursula Burns, is interviewed here.
Remaining one of our 50 Disruptive Companies is even less of a sure thing than remaining a Fortune 500 company. Brian Bergstein, MIT Technology Review’s deputy editor, notes in the introduction to the package, “The pace of technological change is brutal … Only 15 of these 50 companies were also here last year.” (It’s actually only 14 if you don’t count Nicira, a company on last year’s list that was acquired by VMware.) Write to me at firstname.lastname@example.org and tell me what you think.
UPDATE: An earlier version of this column had “comparative” advantage rather than “competitive advantage.” Nations have comparative advantage; companies possess competitive advantage.