Why Isn’t America Innovating Like It Used To?
America isn’t innovating like it used to. And by “like it used to,” I mean the period from the end of World War II to 1973, when an explosion of new technologies increased worker productivity at a pace that, had it continued to the present day, would have raised the average worker’s wage by 51 percent, or $18 per hour. (This difference is represented by the gray area in the graph, above.)
That’s just one of the surprising (at least to me) long-term trends explained in a new report from the Brookings Institution, A Dozen Economic Facts About Innovation, which delves into everything from the reasons for wage stagnation among middle-income men to the effects of innovation on life expectancy. (I’ll be delving into more of the report’s findings starting next week, and collecting the posts into a series.)
But back to the graph: What’s it measuring? From the report:
The economic growth not accounted for by known factors such as increases in the number of machines, labor, or a more educated workforce is called total factor productivity (TFP) growth. TFP thus measures any advancement that has been made in how to use existing resources to increase overall output and incomes.
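In other words, TFP is a residual: the growth left over once the measurable inputs have been subtracted out. A minimal sketch of that accounting, using a standard Solow-style decomposition with made-up illustrative numbers (not figures from the Brookings report), looks like this:

```python
# Growth-accounting sketch: TFP growth (the "Solow residual") is output
# growth left over after subtracting the contributions of capital and labor.
# The capital share and the sample figures below are illustrative assumptions,
# not numbers from the report.

def tfp_growth(output_growth, capital_growth, labor_growth, capital_share=0.3):
    """Solow residual: g_A = g_Y - alpha * g_K - (1 - alpha) * g_L."""
    return (output_growth
            - capital_share * capital_growth
            - (1 - capital_share) * labor_growth)

# Hypothetical year: output up 4%, capital stock up 3%, hours worked up 1%.
# Roughly 2.4 points of growth are left unexplained -- that residual is TFP.
print(round(tfp_growth(0.04, 0.03, 0.01), 4))  # prints 0.024
```

The point of the sketch is simply that TFP isn’t measured directly; it’s whatever growth the machines and workers can’t account for, which is why it serves as a proxy for innovation.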
Clearly, something powerful was going on in the post-war years that began to peter out by the 1970s. The Brookings report offers no explanation. It would be easy to hypothesize about what’s going on here – the 1970s marked the peak of the availability of cheap energy, or Americans did not pursue education at a rate sufficient to capitalize on existing growth in innovation, or we simply gobbled up all the low-hanging fruit in terms of improvements in machinery and business processes.
Not being an economist, I can only conjecture. But here is one data point from the field I cover most often, information technology. Many of us probably take it for granted that information technology makes us more productive, yet this intuition is extremely controversial, and in fact much of the research on the relationship between IT and productivity has historically found no correlation, or even a negative correlation.
I’m not saying that our investment in computers and communications equipment was a misdirection of resources that actually hurt the rate at which we improved our productivity. But I can’t help but wonder – again, just speculating – whether one contributing factor in this trend was the Kafkaesque world of cubicles and knowledge work that IT made widespread. Don’t movies like Brazil and Office Space parody a workplace, once the sole province of Bartleby and his fellow scriveners, that puts a dent in the resolve of the average American worker to perform to his or her highest ability?
Is it possible, in short, that even as it has made our jobs physically easier, technology has, by eliminating the human element from our labors, simply made us enjoy work less?
Of course, I realize that this is hardly an original observation.