
In the end, as the story of the emperor’s new clothes reminds us, somebody has to break the spell. In May 2003, Nicholas Carr cast himself in the naysayer’s role by publishing an article titled “IT Doesn’t Matter” in the Harvard Business Review. In 2004 he followed that with a book, Does IT Matter? Information Technology and the Corrosion of Competitive Advantage. Thereby, he aroused the ire of the good and the great in Silicon Valley and Redmond, WA.

For that, he won a little fame. Now he has a new book, The Big Switch: Rewiring the World, from Edison to Google, which will almost certainly influence a large audience. Carr persuasively argues that we’re moving from the era of the personal computer to an age of utility computing, by which he means the expansion of grid computing, the distribution of computing and storage over the Internet, until it accounts for the bulk of what the human race does digitally. And he nicely marshals his historical analogies, detailing how electricity delivered over a grid supplanted the various power sources used during most of the 19th century. Many readers may find his dark conclusions unconvincing. I think he could have borne in mind the old joke: predicting is hard, especially about the future. That said, I also suspect he’s right to suggest that in a decade or so, many things we now believe to be permanent will have disappeared.

Given that Carr’s conclusions are controversial, it’s helpful to trace his thesis in full. In “IT Doesn’t Matter,” he argued that as industries mature, the products or services they supply become commodities that compete on price alone. The information technology industry, he continued, had arrived at that phase: for most companies that did not themselves develop and sell IT, information technology offered no competitive advantage and was just another cost of doing business. It wasn’t hard to find evidence for Carr’s contention. A business school truism since Clayton Christensen’s 1997 book The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail is that you can tell a sector has been commodified when competition has created a “performance oversupply,” where almost any product differentiation is unwanted. And indeed, by sometime before the 20th century’s end, the vast majority of PCs had far more processing and storage capacity than their users needed for the most common tasks: e-mail, Web browsing, word processing. In fact, Carr pointed out, 70 percent of a typical Windows network’s storage capacity went unused.

By 2000, Carr claimed, close to 50 percent of American companies’ annual capital expenditures went to IT: every year, U.S. businesses acquired more than 100 million new PCs. The biggest IT-associated business risk that companies faced, he concluded, was overspending. It was time for businesses to “explore cheaper solutions, including open-source applications and bare-bones network PCs,” he argued. “If a company needs evidence of the kind of money that might be saved, it need only look at Microsoft’s profit margin.”
