
Despite the fuzziness about exactly what Moore’s Law states, its gist is indisputable: computer prices have fallen even as computer capabilities have risen. At first glance, this is unsurprising. Although digital gurus often herald the advent of better products at lower costs as an unprecedented boon, it is in fact an economic commonplace. A car from 1906, which by today’s standards is barely functional, then cost the equivalent of $52,640, according to a study by Daniel Raff of the Wharton School of Business and Manuel Trajtenberg of Tel Aviv University. Nonetheless, the digital gurus have a point. The improvements in computer chips have been unprecedentedly rapid: “manna from heaven,” in the phrase of Erik Brynjolfsson, an economist at MIT’s Sloan School of Management. “It’s this lucky combination of geometry and physics and engineering,” he says. “The technical innovation is normal, but the rate at which it is occurring is highly unusual.”

Drawn by rapidly improving products at rapidly falling prices, U.S. spending on computers has risen for the last twenty years at an average annual clip of 24 percent, a Moore’s Law of its own. In 1999, U.S. companies spent $220 billion on computer hardware and peripherals, more than they invested in factories, vehicles or any other kind of durable equipment. Computers grew so ubiquitous and powerful that it became commonplace to hear the claim that the nation was in the middle of a “digital revolution.” Moore’s Law, the pundits claim, has created a “new economy.”
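To put that growth rate in perspective, here is a minimal sketch, not from the article, of what a sustained 24 percent average annual increase compounds to over twenty years; the starting value is arbitrary and only the growth multiple matters.

```python
# Minimal sketch (illustrative, not from the article): compound the 24 percent
# average annual growth in U.S. computer spending over twenty years.
rate = 0.24    # average annual growth cited in the text
years = 20     # the "last twenty years" referred to above

multiple = (1 + rate) ** years
print(f"Growth multiple over {years} years: about {multiple:.0f}x")
# Prints roughly 74x: spending doubling roughly every three years,
# which is why the text calls it "a Moore's Law of its own."
```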

Maybe so, but for a number of years the evidence didn’t seem to be there. Like everyone else, economists had been discovering the wonders of the inexpensive beige boxes now on their desks. They kept waiting to see the rewards of computing pop up in the government statistics on income, profits and productivity. But it didn’t happen. Throughout the 1980s and the first part of the 1990s the huge national investment in digital technology seemed to have almost no payoff; Moore’s Law ended up boosting profits for chip-makers, but hardly anyone else. “We see the computer age everywhere except in the productivity statistics,” the Nobel Prize-winning MIT economist Robert M. Solow remarked in 1987.

The puzzle (huge expenditures with little apparent benefit) became known as the “productivity paradox.” Not only were these new technical wonders not useful, some researchers argued, they might actually be harmful. Since 1980 the service industries alone have spent more than a trillion dollars on computer hardware and software. Yet Stephen S. Roach, chief economist of Morgan Stanley, suggested in 1991 that this had merely transformed the service sector from an industry characterized by variable labor costs to one that was increasingly dominated by fixed hardware costs. The least productive “portion of the economy,” Roach argued, “[is] the most heavily endowed with high-tech capital”; the more computers, in other words, the less value.

“Look at hotel checkouts,” says Lester Thurow, one of Brynjolfsson’s colleagues at Sloan. “They’re completely computerized now, but nobody seems to be doing anything faster. The same thing at the supermarket: you wait in line just as long as you used to wait.” To Thurow, the service sector, which accounts for almost three-quarters of the economy, “seems at first glance to have swallowed vast amounts of computing power without a trace.”

“Nobody could understand it,” says Hal Varian, an economist at the School of Information Management and Systems at the University of California, Berkeley (see “What Are the Rules, Anyway?” TR March/April 1999). “On the face of it, the statistics coming out of the government were saying that this massive investment was senseless. In the past, technological innovation has almost invariably increased living standards: look at electricity, railroads, telephones, antibiotics. And here was Moore’s Law, innovation of unprecedented rapidity, which seemed to create nothing for human welfare. But if computers had so little payoff, why was everyone rushing to buy the damned things?”
