America isn’t innovating like it used to. And by “like it used to,” I mean the period from after World War II to 1973, when an explosion of new technologies increased worker productivity at a pace that, had it continued to the present day, would have raised the average worker’s wage by 51 percent, or $18 per hour. (This difference is represented by the gray area in the graph, above.)
That’s just one of the surprising (at least to me) long-term trends explained in a new report from The Brookings Institution, A Dozen Economic Facts About Innovation, which delves into everything from the reasons for wage stagnation among middle-income men to the effects of innovation on life expectancy. (I’ll be delving into more of the report’s findings starting next week, and collecting the posts into a series.)
But back to the graph: What’s it measuring? From the report:
The economic growth not accounted for by known factors such as increases in the number of machines, labor, or a more educated workforce is called total factor productivity (TFP) growth. TFP thus measures any advancement that has been made in how to use existing resources to increase overall output and incomes.
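In practice, TFP growth is usually measured as the textbook “Solow residual”: whatever output growth is left over after subtracting the share-weighted growth of capital and labor. A minimal sketch of that arithmetic, with purely hypothetical growth rates and capital share (none of these numbers come from the report):

```python
def tfp_growth(output_growth, capital_growth, labor_growth, capital_share=0.3):
    """Solow residual: g_TFP = g_Y - alpha * g_K - (1 - alpha) * g_L.

    capital_share (alpha) is capital's share of income; labor gets the rest.
    All growth rates are decimal fractions (0.03 = 3 percent per year).
    """
    return (output_growth
            - capital_share * capital_growth
            - (1 - capital_share) * labor_growth)

# Hypothetical year: 3% output growth, 2% capital growth, 1% labor growth.
# The residual is the part of growth not explained by inputs.
g = tfp_growth(0.03, 0.02, 0.01)
print(f"TFP growth: {g:.2%}")  # 1.70%
```

The point of the residual framing is exactly what the report says: it captures improvements in *how* existing machines and workers are used, not increases in their quantity.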
Clearly, something powerful was going on in the postwar years that began to peter out by the 1970s. The Brookings report offers no explanation, and it would be easy to hypothesize about what happened: perhaps the 1970s marked the peak of cheap energy, perhaps Americans did not pursue education at a rate sufficient to capitalize on the era’s innovations, or perhaps we simply gobbled up the low-hanging fruit of improvements in machinery and business processes.
Not being an economist, I can only conjecture. But here is one data point from the field I cover most often, information technology. Many of us probably take it for granted that information technology makes us more productive, yet this intuition is extremely controversial, and in fact much of the research on the relationship between IT and productivity has historically found no correlation, or even a negative correlation.
I’m not saying that our investment in computers and communications equipment was a misdirection of resources that actually hurt the rate at which we improved our productivity. But I can’t help wondering, again, just speculating, whether one contributing factor in this trend was the Kafkaesque world of cubicles and knowledge work that IT made widespread. Don’t movies like Brazil and Office Space parody a workplace, once the sole province of Bartleby and his fellow scriveners, that has dented the average American worker’s resolve to perform to his or her highest ability?
Is it possible, in short, that even as it has made our jobs physically easier, technology has, by eliminating the human element from our labors, simply made us enjoy work less?
Of course, I realize that this is hardly an original observation.