The productivity paradox

Why brilliant AI technologies are not leading to widespread growth and prosperity.
June 18, 2018

To become wealthier, a country needs strong growth in productivity—the output of goods or services from given inputs of labor and capital. For most people, in theory at least, higher productivity means the expectation of rising wages and abundant job opportunities.

Productivity growth in most of the world’s rich countries has been dismal since around 2004. Especially vexing is the sluggish pace of what economists call total factor productivity—the part that accounts for the contributions of innovation and technology. In a time of Facebook, smartphones, self-driving cars, and computers that can beat a person at just about any board game, how can the key economic measure of technological progress be so pathetic? Economists have tagged this the “productivity paradox.”

Some argue that it’s because today’s technologies are not nearly as impressive as we think. The leading proponent of that view, Northwestern University economist Robert Gordon, contends that compared with breakthroughs like indoor plumbing and the electric motor, today’s advances are small and of limited economic benefit. Others think productivity is in fact increasing but we simply don’t know how to measure things like the value delivered by Google and Facebook, particularly when many of the benefits are “free.”

Both views probably misconstrue what is actually going on. It’s likely that many new technologies are being used simply to replace workers rather than to create new tasks and occupations. What’s more, the technologies that could have the most impact are not yet widely used. Driverless vehicles, for instance, are still absent from most roads. Robots are rather dumb and remain rare outside manufacturing. And AI remains a mystery to most companies.

We’ve seen this before. In 1987 MIT economist Robert Solow, who won that year’s Nobel Prize for defining the role of innovation in economic growth, quipped to the New York Times that “you can see the computer age everywhere but in the productivity statistics.” Within a few years that had changed, as productivity climbed through the mid-to-late 1990s.

What’s happening now may be a “replay of the late ’80s,” says Erik Brynjolfsson, another MIT economist. Breakthroughs in machine learning and image recognition are “eye-popping”; the delay in implementing them only reflects how much change that will entail. “It means swapping in AI and rethinking your business, and it might mean whole new business models,” he says.

In this view, AI is what economic historians consider a “general-purpose technology.” These are inventions like the steam engine, electricity, and the internal-combustion engine. Eventually they transformed how we lived and worked. But businesses had to be reinvented, and other complementary technologies had to be created to exploit the breakthroughs. That took decades.

Illustrating the potential of AI as a general-purpose technology, Scott Stern of MIT’s Sloan School of Management describes it as a “method for a new method of invention.” An AI algorithm can comb through vast amounts of data, finding hidden patterns and predicting possibilities for, say, a better drug or a material for more efficient solar cells. It has, he says, “the potential to transform how we do innovation.”

But he also warns against expecting such a change to show up in macroeconomic measurements anytime soon. “If I tell you we’re having an innovation explosion, check back with me in 2050 and I’ll show you the impacts,” he says. General-purpose technologies, he adds, “take a lifetime to reorganize around.”

Even as these technologies appear, huge gains in productivity aren’t guaranteed, says John Van Reenen, a British economist at Sloan. Europe, he says, missed out on the dramatic 1990s productivity boost from the IT revolution, largely because European companies, unlike US-based ones, lacked the flexibility to adapt.


Illustration by Rose Wong