
How IT Costs More Jobs than It Creates

A new book challenges the standard view that technological advances are always good for employment.
October 25, 2011

Recent advances in information technologies may be driving people out of work and enriching the already rich, a new book argues. The book challenges the long-held view that new technology displaces workers in the short term but always creates more jobs in the long term.

Erik Brynjolfsson, director of the Center for Digital Business at MIT’s Sloan School of Management, and Andrew McAfee, a principal research scientist at the center, cowrote the new e-book, Race Against the Machine: How the Digital Revolution Is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy, which is being released today.

Brynjolfsson and McAfee argue that technology seems to be doing three things simultaneously: enabling CEOs and other leaders in some fields to earn outsize incomes; replacing people with software in certain kinds of service jobs; and—as factories automate at a faster pace—benefiting owners at the expense of workers.

Some 60 percent of the wealth created in the United States between 2002 and 2007 went to the top 1 percent of Americans. This is not merely the result of financial deregulation or favorable Bush-era tax breaks; information technology has enabled the far broader sale of digital goods and the expansion of software-aided management, the pair argues.

“Technology lets superstars—whether Mark Zuckerberg or Lady Gaga or a hedge fund manager—leverage their skills and talents across far more assets and customers than they could have done previously,” Brynjolfsson says. “You can distribute bits—costlessly, globally, instantly—in ways you can’t distribute atoms. Anything that is digital, from software to music, can reach a much broader global audience. This is also true for the business processes that you embed in software. CEOs and others are leveraging that.”

This dynamic could help explain how the economy and productivity can grow while jobs shrink. That’s exactly what happened during the first decade of the 2000s, in a striking departure from the previous six decades, each of which saw double-digit job growth. As McAfee put it at Technology Review’s EmTech 2011 conference last week: “Technology grows the overall economic pie, but that’s different than saying it will leave everyone unambiguously better off.”

Lost decade: Despite economic growth, the United States lost jobs in the first decade of the 21st century, in a striking departure from the previous six decades.

Software has directly put some people out of work, as anyone who has made an airline reservation or sought directory assistance over the telephone may have noticed in recent years. McAfee has argued previously that other jobs—such as certain kinds of document examination once done by armies of lawyers—can now be done competently by scanning technologies and software. Intelligent assistants and question-answering software—of which IBM’s Watson is one example—may accelerate the trend, the two men argue. And while entrepreneurship in the United States has not diminished, “what’s changed is you can have a startup that employs fewer people than it did a decade ago,” McAfee said last week.

As for the third trend—the rise of robotic automation—Brynjolfsson points to the case of Foxconn, the global electronics manufacturer, which plans to replace many of its factory workers in China with a million new robots. “That means more income goes to capital, and less income goes to labor,” he says.

Robert Solow, who won the Nobel Prize in 1987 for his macroeconomic research on economic growth, including the role of technology, says that “advances in technology always throw people out of work,” but that “the economic history so far is that aggregate employment—and employment at rising wages—has not suffered.”

He says that while he has no reason to disagree with the book’s ideas, it is too soon to say whether the past decade might be different. He notes that some of the technologies described in the book, like driverless cars, might give “the casual reader some exaggerated notion of how close we are to enormous changes in the way we live, and enormous increases in labor productivity” that would throw even more people out of work.

And Solow, who is 87, notes that the aspirations of billions of poor people in other parts of the world will create ample room for continued economic growth—and employment—around the globe. “Some of that will be done by those large low-wage populations, but it offers plenty of opportunities for higher-productivity, higher-wage workers in the rich countries to export,” he says.

The United States and the rest of the world have, of course, lived through profound technological and employment shifts in the past. As Brynjolfsson points out, about 90 percent of Americans worked on farms in 1800, but by 1900, that number was only 41 percent, thanks partly to technology and partly to the opening of more-fertile farmland in the Midwest. Yet people adapted and new jobs emerged. “We have always had to redeploy and reinvent,” he says, “but now it’s happening so fast that people aren’t keeping up.”

Solow agrees with the book’s conclusion that one near-term policy prescription is to develop more-relevant job training. “The other question not raised in the book—but raised by the content, and this has been discussed by economists off and on for years,” Solow says, “is how to make an economy that will deal with a situation in which an enormous amount of labor becomes superfluous, in which almost all the work is done by robots, including the manufacture of robots.”

“There you have to begin thinking about how you support a population. One way to do it, of course, would be democratization of capital. If all the income is being earned, in effect, by capital—by machines of one kind or another—then the economy becomes a kind of mutual fund, a situation in which the ownership of all that capital is spread around the population. That is a century away, or two centuries away, or maybe never,” he says.

For now, Brynjolfsson and McAfee cite evidence that—in addition to other macroeconomic problems, and the 2008 financial crisis—the U.S. economy is undergoing a structural change wrought by technology. “It’s not just the crash, it’s something that is changing fundamentally in the way people use technology,” Brynjolfsson says. And referring to the ongoing demonstrations on Wall Street and elsewhere, he adds: “It helps bring together why it is that these people are so upset.”
