Why Isn’t America Innovating Like It Used To?
America isn’t innovating like it used to. And by “like it used to,” I mean the period from the end of World War II to 1973, when an explosion of new technologies increased worker productivity at a pace that, had it continued to the present day, would have left the average worker’s wage 51 percent higher than it is now — an extra $18 per hour. (This difference is represented by the gray area in the graph, above.)

That’s just one of the surprising (at least to me) long-term trends explained in a new report from the Brookings Institution, “A Dozen Economic Facts About Innovation,” which covers everything from the reasons for wage stagnation among middle-income men to the effects of innovation on life expectancy. (I’ll be exploring more of the report’s findings starting next week and linking the results into a series.)
But back to the graph: What’s it measuring? From the report:
The economic growth not accounted for by known factors such as increases in the number of machines, labor, or a more educated workforce is called total factor productivity (TFP) growth. TFP thus measures any advancement that has been made in how to use existing resources to increase overall output and incomes.
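To make that definition concrete: TFP is computed as a residual in what economists call growth accounting. Here is a minimal sketch, assuming the standard Cobb-Douglas setup — the report doesn’t spell out its exact method, and the numbers below are purely illustrative, not figures from the report:

    ΔA/A = ΔY/Y − α·(ΔK/K) − (1 − α)·(ΔL/L)

Here Y is output, K is capital, L is labor, α is capital’s share of income, and A is TFP. So if output grows 4 percent in a year while the capital stock grows 3 percent and the labor force grows 2 percent, then with a typical capital share of α = 0.3, TFP growth comes out to 4 − (0.3 × 3) − (0.7 × 2) = 1.7 percent — the slice of growth that can’t be explained by simply adding more machines or more workers.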
Clearly, something powerful was going on in the post-war years that began to peter out by the 1970s. The Brookings report offers no explanation. It would be easy to hypothesize about what happened: perhaps the 1970s marked the peak of the availability of cheap energy; perhaps Americans did not pursue education at a rate sufficient to capitalize on existing growth in innovation; or perhaps we simply gobbled up all the low-hanging fruit in terms of improvements in machinery and business processes.
Not being an economist, I can only conjecture. But here is one data point from the field I cover most often: information technology. Many of us probably take it for granted that information technology makes us more productive, yet this intuition is extremely controversial. Much of the research on the relationship between IT and productivity — the so-called productivity paradox — has historically found no correlation, or even a negative one.
I’m not saying that our investment in computers and communications equipment was a misdirection of resources that actually hurt the rate at which we improved our productivity. But I can’t help wondering (again, just speculating) whether one contributing factor in this trend was the Kafkaesque world of cubicles and knowledge work that IT made widespread. Don’t movies like Brazil and Office Space parody a workplace, once the sole purview of Bartleby and his fellow scriveners, that puts a dent in the resolve of the average American worker to perform to his or her highest ability?
Is it possible, in short, that even as it has made our jobs physically easier, technology has, by eliminating the human element from our labors, simply made us enjoy work less?
Of course, I realize that this is hardly an original observation.