A year ago, Harvard Business Review published a now infamous article called “IT Doesn’t Matter.” Its author, the magazine’s then executive editor Nicholas G. Carr, argued that information technology no longer gives businesses a competitive edge. Carr called information technology managers impatient, wasteful, passive, and lured by the chorus of hype about the so-called strategic value of IT.
Harvard Business Review has 243,000 extremely influential readers. So if it publishes an article saying that information technology doesn’t matter, then an awful lot of important business leaders are going to believe it. And if they do, they’ll run their companies, and our economy, into a ditch.
Many commentators have debunked Carr’s article since it appeared last year. So many, in fact, that I feel like Elizabeth Taylor’s ninth husband: I know what to do, but how to make it interesting? But Carr’s article just won’t stay debunked. And now he has expanded his thesis into a new book called Does IT Matter?, which the Harvard Business School Press published in April. The question-style title hints at some backpedaling, but Carr’s point is basically unchanged, and it needs debunking yet again.
Since I do not subscribe to the ink-on-dead-trees version of the magazine, I bought my copy of Carr’s May 2003 paper through Amazon.com. It was delivered over the Internet in minutes as a PDF file for $7.00. Carr’s new book is also listed on Amazon.com, a triumph of IT-enabled corporate strategy. We see that IT apparently matters to Harvard.
Carr himself has a website, nicholasgcarr.com. IT apparently matters to Carr.
Let’s face it: IT matters to everyone.
Two Trillion Reasons that I.T. Matters
I asked Frank Gens, senior vice president at the information technology market research giant IDC, how much IT matters. (Full disclosure: IDC is owned by IDG, on whose board I serve.) IDC reports that global investment in information technology (including telecommunications) totaled $1.9 trillion in 2003 and, despite Carr, will climb to $2.0 trillion in 2004.
According to a 2003 IDC survey, non-IT business executives spend 20 percent of their time thinking about IT. Are they wasting their time? Again despite Carr, almost 60 percent say that the strategic importance of IT is increasing; only 2 percent say the importance is decreasing. Carr may claim these Harvard-MBA-type executives are foolish or misguided, but 55 percent feel that their companies should use information technology more aggressively; 43 percent feel their usage is just right; and only 2 percent feel that they should be less aggressive.
In Carr’s world, information technology managers are apparently fools, or even frauds, to the tune of $2 trillion per year. Presumably, these managers slavishly upgrade to whatever new thing vendors want to sell. But in the real world, millions of people already work hard to spend their IT budgets wisely. The computer-trade press has been covering this complicated process for almost 40 years.
In warding off his debunkers, Carr has offered some clarifications of his argument. He doesn’t really mean that information technology doesn’t matter; rather, he says, his point is that because IT has been commoditized, like electricity, it confers upon its business users no competitive advantage. He also clarifies that he does not mean that information itself doesn’t matter, nor does he mean that the people using the technology don’t matter. What really doesn’t matter, he says, is the no-longer-proprietary technology infrastructure for storing, processing, and transmitting information. So we can only hope that most of Harvard Business Review’s captains of industry read beyond the article titles before dropping the magazine on their coffee tables.
Carr concludes that since information technology no longer provides a competitive advantage to businesses, they should stop spending wildly on advanced information technology products and services. He admonishes managers to stop being suckers for the latest cool products from Cisco, Intel, Microsoft, Oracle, et al. IT managers should stop squandering corporate assets and begin acting in the best interests of their shareholders. They should become boring minimizers of IT cost and risk.
As evidence, Carr points out that my 30-year-old baby, Ethernet, has been standardized and commoditized. It’s true that last year more than 184 million new Ethernet ports were shipped, at a value of $12.5 billion, and that anyone can buy them. Most of those ports are the current mainstream version of Ethernet, which carries data over wires on local-area networks at 10 or 100 megabits per second.
But now that the post-Internet-bubble nuclear winter is almost over, Ethernet is speeding up, to beyond 1,000 megabits (one gigabit) per second. Ethernet is going into wide-area networks. It’s going wireless. It’s going into embedded systems: the eight billion microprocessors shipped every year that don’t go into PCs.
New Ethernet standards are being created, new commoditization races are being started, and Ethernet, if ever it wasn’t, is once again a tool of corporate strategy. In the article and now again in his book, Carr wrongly equates today’s information technologies with electricity, and then he wrongly characterizes electricity as static. In short, Carr, deep into a post-bubble depression, wrongly declares the end of history.
The history of electricity is not over, however. Controlling electrical power grids is still famously problematic, and that’s to say nothing of the exciting developments in technologies such as wind, solar, fission, fusion, hydrogen, and batteries, all of which present strategic opportunities. And information technology is bigger and more recent than electricity. Both are still rapidly evolving; both are very much alive as important elements of corporate strategy.
Much of the research on information technology usage that Carr cites is of dubious validity. Take, for example, the studies that, as Carr puts it, “consistently show” that expenditure on IT as a fraction of company revenue is inversely correlated with financial performance. One study that Carr cites states that the 25 companies with the highest economic returns spent on average just 0.8 percent of revenues on IT, while the typical company spent 3.7 percent. But this hardly proves Carr’s conclusion. Rather, it indicates that companies investing wisely in IT increase revenues much faster than those that invest unwisely, too little, or not at all. Companies that invest poorly in IT don’t increase revenues as quickly, so their IT expenditures are higher as a fraction of revenue. And companies that invest unwisely enough go out of business and are never counted in the studies at all. IT still matters.
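The denominator effect here is easy to see with a toy calculation. The numbers below are hypothetical, chosen only to mirror the 0.8-versus-3.7-percent figures in the study Carr cites: two firms spend the same absolute amount on IT, but the one whose revenue grew faster shows a much smaller IT-spend-to-revenue ratio.

```python
# Hypothetical figures (in millions of dollars), for illustration only:
# both firms spend the same absolute amount on IT.
it_spend = 40.0

fast_grower_revenue = 5000.0  # wise IT investment helped revenue grow
slow_grower_revenue = 1100.0  # revenue stagnated

# IT spending as a percentage of revenue
fast_ratio = it_spend / fast_grower_revenue * 100  # 0.8 percent
slow_ratio = it_spend / slow_grower_revenue * 100  # about 3.6 percent

print(f"fast grower: {fast_ratio:.1f}% of revenue on IT")
print(f"slow grower: {slow_ratio:.1f}% of revenue on IT")
```

Identical IT spending, very different ratios: the correlation the studies report can come entirely from the revenue denominator, not from IT spending hurting performance.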
Raining On The I.T.-Bashers’ Parade
Carr is not the first person to question the value of information technology. Paul Strassman, for example, despite being a high-profile, big-budget chief information officer for such organizations as NASA, the U.S. Department of Defense, and Xerox, has made a second career of producing studies that fail to find benefits from IT. Morgan Stanley economist Stephen Roach is another famous critic of IT. During the 1990s, he claimed that increasing investments in information technology were showing no benefits. Roach, echoing MIT economist Robert Solow, wrote that IT investments were not appearing in U.S. productivity numbers. I called Solow, a Nobel Prize winner, and he admitted that this so-called productivity paradox might easily be explained by how poorly productivity is measured. Productivity numbers are hard to come by, and Roach relied on outmoded methods. But Roach stuck by his IT-doesn’t-matter numbers, like the proverbial drunk looking for his wallet under a street lamp.
Today, information technology accounts for about half of capital expenditures by U.S. companies. Productivity is high and increasing rapidly. What is Roach saying now? He says that the productivity numbers are highly questionable. In other words, if the data conflict with your theory, throw out the data. It makes me wonder whether Roach, like Carr, just has a bad attitude about IT.
In Carr’s reply to early critics, published on the Web by the Harvard Business Review in June 2003, he wrote that his article “has at least succeeded in setting off an important and long-overdue debate about the role of information technology in business.” I don’t think so. If anything, Carr has succeeded only in misleading his readers.
Howard Smith and Peter Fingar, in their 2003 book IT Doesn’t Matter-Business Processes Do, argue that Carr is not only wrong but dangerous. They remind us of what happened when Harvard Business Review published Michael Hammer’s 1990 article “Reengineering Work.” Too many Harvard MBAs decided to take the easy part of Hammer’s advice and downsized their companies to death. Unless Carr’s argument is debunked, the current crop of reigning MBAs will be tempted to run WordPerfect on mid-1980s PCs connected to IBM 360 mainframes.
Which brings us to Carr’s central conceit. He urges IT managers not to venture foolishly out onto technology’s cutting edge and to buy only that which has low risk and high value to their companies. Carr urges this as if it were breaking news.
In fact, IDG alone publishes 300 information technology magazines worldwide, and each has several competitors. All of these have been offering advice for decades on just how far onto the bleeding edge of technology it is wise to go to give your company an edge. Taking technology risks, when done well, can bring competitive advantage. When done poorly, it can bring disaster. But that’s a balancing act that the information technology managers of the world were well aware of long before Carr put in his two cents.
We often brag about the marvelous U.S. innovation machine. We brag about our world-leading research universities. We brag about our entrepreneurs and the venture capitalists, like me, who back them. But there is an unsung player in our marvelous innovation machine: the aggressive users of information technology. In Germany, by contrast, it’s hard to buy IT unless it’s from Siemens. In the United States, startups readily find managers out on the cutting edge, searching for new, smarter, and more efficient ways to do things-a quest that keeps our vaunted innovation machine humming.
If business executives follow Carr’s advice, who will provide innovation’s test beds? How will new technologies find their markets? This may be the most important reason to debunk Carr’s arguments once and for all: if they harden into conventional business wisdom, American ingenuity will be strangled in its bassinet.
I serve on the board of a small public company in Silicon Valley called Avistar. For 10 years, Avistar has been marketing networked desktop videoconferencing to large companies. Avistar’s hardware and software have worked increasingly well for a long time. What’s taking time is their adoption: the search for one situation after another in which the technologies provide a value that’s worth the risk.
Avistar CEO Jerry Burnett disagrees strongly with Carr and recommends a division of labor in IT management. On one hand are specialists in what Burnett calls “availability management.” These might be mistaken for the cost and risk minimizers that Carr extols. On the other hand are specialists in “adoption management.” These are the people Carr wants demotivated, demoted, or fired.
Carr argues that things that are widely available, like IT, cannot be used for sustained competitive advantage. Well, since Harvard Business Review is received by almost a quarter-million people and can be bought by anyone with $16.95, then according to Carr’s own argument, that publication itself doesn’t matter. Cancel your subscription and download any interesting articles from back issues-which any teenager will be able to find for you on the Internet for free.