Stop Saying Robots Are Destroying Jobs—They Aren’t
MIT Technology Review editor David Rotman recently wrote an article called “How Technology Is Destroying Jobs.” The title not only sums up the article’s thesis, it sums up the view of many pundits seeking to explain lackluster job growth. But technology has never destroyed jobs on a net basis, and it won’t in the future.
The article focuses on MIT scholars Erik Brynjolfsson and Andrew McAfee, authors of the widely cited book Race Against the Machine. For them, workers are “losing the race against the machine, a fact reflected in today’s employment statistics.” They go on to argue that, “As we head … into the period where continuing exponential increases in computing power yield astonishing results—we expect that economic disruptions will only grow as well.”
Brynjolfsson and McAfee rely on a key data point to make their case: “The pattern is clear: as businesses generated more value from their workers, the country as a whole became richer, which fueled more economic activity and created even more jobs. Then, beginning in 2000, the lines diverge; productivity continues to rise robustly, but employment suddenly wilts. By 2011, a significant gap appears between the two lines, showing economic growth with no parallel increase in job creation.”
But the reality is that there is no logical relationship between job growth and productivity. To see why, imagine two nations, each with annual productivity growth of around 2 percent. Nation A has a declining workforce because more people are retiring than are entering prime working age. Nation B has a growing workforce because of higher fertility rates and immigration. As this example of real nations shows (Japan as nation A and the U.S. as nation B), an economy can have high productivity alongside either low or high employment growth. The reason job growth slowed after 2000 was largely demographic: the number of adults in the workforce (employed and unemployed) grew 18 percent in the 1980s and 13 percent in the 1990s, but just 8 percent in the 2000s, as baby boomers aged and women’s entry into the workforce peaked.
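The arithmetic behind this is the standard growth-accounting identity: output growth is (approximately) productivity growth plus employment growth. A minimal sketch of the two-nation thought experiment (the workforce-growth figures below are illustrative, not Japan’s or the U.S.’s actual rates):

```python
def output_growth(productivity_growth, employment_growth):
    """Growth-accounting identity: (1 + g_output) = (1 + g_productivity) * (1 + g_employment).

    For small rates this is roughly g_output = g_productivity + g_employment.
    """
    return (1 + productivity_growth) * (1 + employment_growth) - 1

# Both hypothetical nations have identical 2% productivity growth.
nation_a = output_growth(0.02, -0.005)  # shrinking workforce (Japan-like)
nation_b = output_growth(0.02, 0.010)   # growing workforce (U.S.-like)

# Same productivity growth, very different output and employment pictures:
# nation A grows about 1.5% a year, nation B about 3%.
```

The point of the sketch is simply that productivity growth pins down neither employment growth nor total output growth; the demographic term does independent work.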
The data are just as clear on the lack of a relationship between productivity and unemployment. If “robots” really were the cause of today’s sluggish job growth, then productivity growth should have been higher after 2008 than before. In fact, productivity grew just 1.8 percent a year from 2008 to 2012, compared with 2.6 percent a year from 2000 to 2008, a period of close to full employment.
Brynjolfsson and McAfee’s mistake comes from considering only the first-order effects of automation, in which the machine replaces the worker. But when a machine replaces a worker, there is a second-order effect: the organization using the machine saves money, and that money flows back into the economy through lower prices, higher wages for the remaining workers, or higher profits. In all three cases that money gets spent, stimulating demand that other companies respond to by hiring more workers.
This common-sense view is borne out by virtually all economic studies of the relationship between productivity and jobs. While some studies have found that productivity growth has short-term negative job impacts, all of them find either no impact or a positive impact on total jobs in the longer term. As the OECD stated in a definitive review of the studies on productivity and employment:
Historically, the income-generating effects of new technologies have proved more powerful than the labor-displacing effects: technological progress has been accompanied not only by higher output and productivity, but also by higher overall employment.
Sure, but those who argue that robots kill jobs insist that this time it’s different. As the article states, “Technologies like the Web, artificial intelligence, big data, and improved analytics—all made possible by the ever increasing availability of cheap computing power and storage capacity—are automating many routine tasks.”
But there are two problems with this argument. First, it assumes that productivity growth rates will increase significantly. But there is little evidence that the United States will see productivity growth in excess of 3 percent a year (the best we have ever done). This is in part because despite IT advances that boost productivity in information-based functions, a growing share of jobs involve interacting with people (e.g., nursing homes, police and fire) or doing physical tasks that are difficult to automate (e.g., construction, janitorial services).
But even if I am wrong (and I hope I am) and productivity growth miraculously rises above 5 percent a year, it still wouldn’t matter for jobs. That would mean national income increases 5 percent a year, and we would all buy more restaurant meals, vacations, cars, houses, therapeutic massages, college educations, and 3-D TVs. Workers are needed to produce those goods and services. And if those too are somehow automated, then we have even more money to spend on still other goods and services, creating jobs there instead.
In sum, the worries of machines overtaking humans are as old as machines themselves. Pitting man against machine only stokes antipathy toward technology and could have a chilling effect on the innovation and adoption of technology essential to grow our economy. This is the last thing our economy and workers need. As my coauthor Stephen J. Ezell and I argue in Innovation Economics: The Race for Global Advantage, far from being doomed by an excess of technology, we are actually at risk of being held back by too little technology.
So, Andrew and Erik, if you really believe robots are taking our jobs, let’s make a Long Bet. I will bet that by 2023 we will have at least 5 percent more jobs in the United States than we do today. You in?
Robert D. Atkinson is the president of the Information Technology & Innovation Foundation, a think tank based in Washington, D.C.