Why Isn’t America Innovating Like It Used To?
America isn’t innovating like it used to. And by “like it used to,” I mean the period from the end of World War II to 1973, when an explosion of new technologies increased worker productivity at a pace that, had it continued to the present day, would have left the average worker’s wage 51 percent, or $18 per hour, higher than it is. (That difference is represented by the gray area in the graph above.)

That’s just one of the surprising (at least to me) long-term trends explained in a new report from the Brookings Institution, A Dozen Economic Facts About Innovation, which delves into everything from the reasons for wage stagnation among middle-income men to the effects of innovation on life expectancy. (I’ll be digging into more of the report’s findings starting next week and collecting the posts into a series.)
But back to the graph: What’s it measuring? From the report:
The economic growth not accounted for by known factors such as increases in the number of machines, labor, or a more educated workforce is called total factor productivity (TFP) growth. TFP thus measures any advancement that has been made in how to use existing resources to increase overall output and incomes.
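To make that definition concrete, here is a minimal sketch of the growth-accounting arithmetic economists typically use to back out TFP as a residual. The Cobb-Douglas production function and every number below are my own illustrative assumptions, not figures from the Brookings report.

```python
# A minimal sketch of the "Solow residual" idea behind TFP: whatever output
# growth is left over after accounting for measured growth in capital and labor.
# The Cobb-Douglas form (Y = A * K**alpha * L**(1 - alpha)) and all numbers
# here are illustrative assumptions, not figures from the Brookings report.

alpha = 0.3             # capital's share of income (assumed)
output_growth = 0.035   # annual growth in output (illustrative)
capital_growth = 0.030  # annual growth in the capital stock (illustrative)
labor_growth = 0.015    # annual growth in education-adjusted labor input (illustrative)

# Growth accounting: subtract the contributions of capital and labor;
# whatever remains is attributed to TFP.
tfp_growth = output_growth - alpha * capital_growth - (1 - alpha) * labor_growth
print(f"Implied TFP growth: {tfp_growth:.2%}")  # -> 1.55%
```

The point is simply that TFP is measured as a leftover: a slowdown like the one in the graph means output growth stopped outrunning what added machines and workers alone could explain.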
Clearly, something powerful was going on in the post-war years that began to peter out by the 1970s. The Brookings report offers no explanation. It would be easy to hypothesize about what happened: perhaps the 1970s marked the peak of cheap energy, perhaps Americans did not pursue education at a rate sufficient to capitalize on the innovation that was available, or perhaps we simply gobbled up all the low-hanging fruit in improvements to machinery and business processes.
Not being an economist, I can only conjecture. But here is one data point from the field I cover most often, information technology. Many of us probably take it for granted that information technology makes us more productive, yet that intuition is extremely controversial: much of the research on the relationship between IT and productivity has historically found no correlation, or even a negative one.
I’m not saying that our investment in computers and communications equipment was a misdirection of resources that actually hurt the rate at which productivity improved. But I can’t help wondering (again, just speculating) whether one contributing factor in this trend was the Kafkaesque world of cubicles and knowledge work that IT made widespread. Don’t movies like Brazil and Office Space parody exactly the kind of workplace, once the sole province of Bartleby and his fellow scriveners, that puts a dent in the average American worker’s resolve to perform to his or her highest ability?
Is it possible, in short, that even as it has made our jobs physically easier, technology has, by eliminating the human element from our labors, simply made us enjoy work less?
Of course, I realize that this is hardly an original observation.