
Why tech didn’t save us from covid-19

America's paralysis reveals a deep and fundamental flaw in how the nation thinks about innovation.
June 17, 2020

Technology has failed the US and much of the rest of the world in its most important role: keeping us alive and healthy. As I write this, more than 380,000 people are dead, the global economy is in ruins, and the covid-19 pandemic is still raging. In an age of artificial intelligence, genomic medicine, and self-driving cars, our most effective response to the outbreak has been mass quarantines, a public health technique borrowed from the Middle Ages.

Nowhere was the technology failure more obvious than in testing. Standard tests for diseases like covid-19 use polymerase chain reaction (PCR), a more than 30-year-old chemistry technique routinely used in labs around the world. Yet although scientists identified and sequenced the new coronavirus within weeks of its appearance in late December—an essential step in creating a diagnostic—the US and other countries stumbled in developing PCR tests for general use. Incompetence and a sclerotic bureaucracy at the US Centers for Disease Control and Prevention meant the agency created a test that didn’t work and then insisted for weeks that it was the only one that could be used.

Meanwhile, the six-inch nasopharyngeal swabs needed to reach far up a person’s nose to collect samples for PCR testing were in short supply, as were the chemical reagents necessary to process the samples. In the critical early weeks when the coronavirus could still have been contained, many Americans, even those seriously ill, couldn’t get tested for the deadly virus. Even four months into the pandemic, the US still isn’t equipped to do the massive and frequent screening needed to safely end a general lockdown. 

Combined with the lack of testing, a splintered and neglected system of collecting public health data meant epidemiologists and hospitals knew too little about the spread of the infection. In an age of big data in which companies like Google and Amazon use all sorts of personal information for their advertising and shopping operations, health authorities were making decisions blind. 

It wasn’t only the lack of testing and data that doomed so many people, of course. There weren’t enough ventilators or protective masks, nor factories to make them. “The pandemic has shone a bright light on just how much US manufacturing capabilities have moved offshore,” says Erica Fuchs, a manufacturing expert at Carnegie Mellon University.

Why couldn’t the US’s dominant tech industry and large biomedical sector provide these things? It’s tempting to simply blame the Trump administration’s inaction. Rebecca Henderson, an economist and management expert at Harvard, points to a long history of the US government’s directing industry and innovation during crises. Many companies, she says, were waiting for the administration to mobilize the effort and guide priorities. “I kept thinking, ‘Let’s focus the US thoughtfulness on testing and we’ll get this.’ I kept waiting for it to happen,” she says. But it never did: “There is simply a vacuum.” 

But Henderson and other experts who study innovation point to a problem deeper than the lack of government intervention. A once-healthy innovation ecosystem in the US, capable of identifying and creating technologies essential to the country’s welfare, has been eroding for decades.

Any country’s capacity to invent and then deploy the technologies it needs is shaped by public funding and government policies. In the US, public investment in manufacturing, new materials, and vaccines and diagnostics has not been a priority, and there is almost no system of government direction, financial backing, or technical support for many critically important new technologies. Without it, the country was caught flat-footed. 

Instead, as Henderson writes in her book Reimagining Capitalism, the US has, over the last half-century, increasingly put its faith in free markets to create innovation. That approach has built a wealthy Silicon Valley and giant tech firms that are the envy of entrepreneurs around the world. But it has meant little investment and support for critical areas such as manufacturing and infrastructure—technologies relevant to the country’s most basic needs.

Though written before covid-19 emerged, Henderson’s book was published in mid-April, as the pandemic was surging in many parts of the US. In it, she describes the role business can play in tackling big problems like climate change and inequality, but she also documents decades of government failure to support the private sector in doing so. Today, she says, it feels as though she’s “living the book.”

The US’s paralysis in the face of covid-19 matters not only because it has already doomed tens of thousands to an early death and crippled the largest economy in the world, but because it reveals a deep and fundamental flaw in how the nation thinks about innovation. 

Building stuff we need

Economists like to measure the impact of innovation in terms of productivity growth, particularly “total factor productivity”—the ability to get more output from the same inputs (such as labor and capital). Productivity growth is what makes advanced nations richer and more prosperous over the long run. For the US as well as most other rich countries, this measure of innovation has been dismal for nearly two decades. 
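(For readers unfamiliar with the term, the minimal sketch below shows how total factor productivity growth is conventionally estimated as a “Solow residual”: the part of output growth left over after accounting for growth in measured inputs like capital and labor. The function, the Cobb-Douglas framing, and the 0.35 capital share are illustrative assumptions for this sketch, not figures from the article.)

```python
# Illustrative sketch only: total factor productivity (TFP) growth estimated as a
# Solow residual under a Cobb-Douglas growth-accounting framework. The capital
# share and the example growth rates below are made-up, not data from the article.

def tfp_growth(output_growth: float,
               capital_growth: float,
               labor_growth: float,
               capital_share: float = 0.35) -> float:
    """Return TFP growth as the residual of output growth after input growth.

    All growth rates are decimal fractions (0.02 == 2%). capital_share is the
    assumed share of capital in output; roughly one-third is a common textbook value.
    """
    labor_share = 1.0 - capital_share
    return output_growth - capital_share * capital_growth - labor_share * labor_growth

# Example with invented numbers: 2% output growth, 3% capital growth, 1% labor growth.
print(f"TFP growth: {tfp_growth(0.02, 0.03, 0.01):.3%}")  # prints ~0.300%
```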

There are a lot of different ideas about why the innovation slowdown happened. Perhaps the kinds of inventions that previously transformed the economy—like computers and the internet, or before that the internal-combustion engine—stopped coming along. Or perhaps we just haven’t yet learned how to use the newest technologies, like artificial intelligence, to improve productivity in many sectors. But one likely factor is that governments in many countries have significantly cut investments in technology since the 1980s. 

Government-funded R&D in the US, says John Van Reenen, an economist at MIT, has dropped from 1.8% of GDP in the mid-1960s, when it was at its peak, to 0.7% now (chart 1). Governments tend to fund high-risk research that companies can’t afford, and it’s out of such research that radical new technologies often arise. 

The problem with letting private investment alone drive innovation is that the money is skewed toward the most lucrative markets. The biggest practical uses of AI have been to optimize things like web search, ad targeting, speech and face recognition, and retail sales. Pharmaceutical research has largely targeted the search for new blockbuster drugs. Vaccines and diagnostic testing, so desperately needed now, are less lucrative. More government money might have boosted those pursuits. 

Nor is it enough to invent new technologies: public support is also vital for helping companies adopt them. That’s especially true in large, slow-moving sectors of the economy such as health care and manufacturing—precisely where the country’s crippled capabilities have been most evident during the pandemic. 

In a widely circulated blog post, internet pioneer and Silicon Valley icon Marc Andreessen decried the US’s inability to “build” and produce needed supplies like masks, claiming that “we chose not to have the mechanisms, the factories, the systems to make these things.” The accusation resonated with many: the US, where manufacturing has deteriorated, seemed unable to churn out things like masks and ventilators, while countries with strong and innovative manufacturing sectors, such as China, Japan, Taiwan, and Germany, have fared far better.

But Andreessen is wrong to portray the unwillingness to build as a deliberate choice. And the country’s ability to make stuff isn’t something that can be quickly revved up. The decline of US manufacturing has been caused by years of financial market pressures, government indifference, and competition from low-wage economies. 


Where did all the money go?

By Tate Ryan-Mosley

US federal funding for R&D has fallen. That’s one cause of sluggish productivity growth.

[Charts: 1. Federal funding has been dropping for decades (US R&D as a percentage of GDP). 2. The US is lagging behind South Korea, Japan, and Germany, and China is catching up (R&D as a percentage of GDP, 2000 vs. 2017). 3. Spending on basic research has been nearly flat ($ million). 4. Total factor productivity (TFP) growth is sluggish (growth rate, %). 5. Manufacturing TFP growth has collapsed (growth rate, %).]

In the US, manufacturing jobs dropped by almost a third between 2000 and 2010 and have barely recovered since. Manufacturing productivity has been particularly poor in recent years (chart 5). What has been lost is not only jobs but also the knowledge embedded in a strong manufacturing base, and with it the ability to create new products and find advanced and flexible ways of making them. Over the years, the country ceded to China and other countries the expertise in competitively making many things, including solar panels and advanced batteries—and, it now turns out, swabs and diagnostic tests too. 

No country should aim to make everything, says Fuchs, but “the US needs to develop the capacity to identify the technologies—as well as the physical and human resources—that are critical for national, economic, and health security, and to invest strategically in those technologies and assets.”  

Regardless of where products are made, Fuchs says, manufacturers need more coordination and flexibility in global supply chains, in part so they aren’t tied to a few sources of production. That quickly became evident in the pandemic; for example, US mask makers scrambled to procure the limited supply of melt-blown fiber required to make the N95 masks that protect against the virus. 

The problem was made worse because manufacturers keep inventories razor-thin to save money, often relying on timely shipments from a sole provider. “The great lesson from the pandemic,” says Suzanne Berger, a political scientist at MIT and an expert on advanced manufacturing, is “how we traded resilience for low-cost and just-in-time production.” 

Berger says the government should encourage a more flexible manufacturing sector and support domestic production by investing in workforce training, basic and applied research, and facilities like the advanced manufacturing institutes that were created in the early 2010s to provide companies with access to the latest production technologies. “We need to support manufacturing not only [to make] critical products like masks and respirators but to recognize that the connection between manufacturing and innovation is critical for productivity growth and, out of increases in productivity, for economic growth,” she says.

The good news is that the US has had this discussion during previous crises. The playbook exists.

Declaring war on the virus

In June 1940, Vannevar Bush, then the director of the Carnegie Institution for Science in Washington, DC, went to the White House to meet President Franklin D. Roosevelt. The war was under way in Europe, and Roosevelt knew the US would soon be drawn into it. As Simon Johnson and Jonathan Gruber, both economists at MIT, write in their recent book Jump-Starting America, the country was woefully unprepared, barely able to make a tank. 

Bush presented the president with a plan to gear up the war effort, led by scientists and engineers. That gave rise to the National Defense Research Committee (NDRC); during the war, Bush directed some 30,000 people, including 6,000 scientists, to steer the country’s technological development. 

The inventions that resulted are well known, from radar to the atomic bomb. But as Johnson and Gruber write, the investment in science and engineering continued well after the war ended. “The major—and now mostly forgotten—lesson of the post-1945 period is that modern private enterprise proves much more effective when government provides strong underlying support for basic and applied science and for the commercialization of the resulting innovations,” they write. 

A similar push to ramp up government investment in science and technology “is clearly what we need now,” says Johnson. It could have immediate payoffs both in technologies crucial to handling the current crisis, such as tests and vaccines, and in new jobs and economic revival. Many of the jobs created will be for scientists, Johnson acknowledges, but many will also go to trained technicians and others whose work is needed to build and maintain an enlarged scientific infrastructure.

This matters especially, he says, because with an administration that is pulling back from globalization and with consumer spending weak, innovation will be one of the few options for driving economic growth. “Scientific investment needs to be a strategic priority again,” says Johnson. “We’ve lost that. It has become a residual. That’s got to stop.” 

Johnson is not alone. In the middle of May, a bipartisan group of lawmakers proposed what they called the Endless Frontier Act to expand funding for “the discovery, creation, and commercialization of technology fields of the future.” They argued that the US was “inadequately prepared” for covid-19 and that the pandemic “exposed the consequences of a long-term failure” to invest in scientific research. The legislators called for $100 billion over five years to support a “technology directorate” that would fund AI, robotics, automation, advanced manufacturing, and other critical technologies.

Around the same time, a pair of economists, Northwestern’s Ben Jones and MIT’s Pierre Azoulay, published an article in Science calling for a massive government-led “Pandemic R&D Program” to fund and coordinate work in everything from vaccines to materials science. The potential economic and health benefits are so large, Jones argues, that even huge investments to accelerate vaccine development and other technologies will pay for themselves. 

Vannevar Bush’s approach during the war tells us it’s possible, though the funding needs to be substantial, says Jones. But increased funding is just part of what is required, he says. The initiative will need a central authority like Bush’s NDRC to identify a varied portfolio of new technologies to support—a function that is missing from current efforts to tackle covid-19. 

The thing to note about all these proposals is that they are aimed at both short- and long-term problems: they are calling for an immediate ramp-up of public investment in technology, but also for a bigger government role in guiding the direction of technologists’ work. The key will be to spend at least some of the cash in the gigantic US fiscal stimulus bills not just on juicing the economy but on reviving innovation in neglected sectors like advanced manufacturing and boosting the development of promising areas like AI. “We’re going to be spending a great deal of money, so can we use this in a productive way? Without diminishing the enormous suffering that has happened, can we use this as a wake-up call?” asks Harvard’s Henderson.

“Historically, it has been done a bunch of times,” she says. Besides the World War II effort, examples include Sematech, the 1980s consortium that revived the ailing US semiconductor industry in the face of Japan’s increasing dominance, by sharing technological innovations and boosting investment in the sector.

Can we do it again? Henderson says she is “hopeful, though not necessarily optimistic.”

The test of the country’s innovation system will be whether over the coming months it can invent vaccines, treatments, and tests, and then produce them at the massive scale needed to defeat covid-19. “The problem hasn’t gone away,” says CMU’s Fuchs. “The global pandemic will be a fact of life—the next 15 months, 30 months—and offers an incredible opportunity for us to rethink the resiliency of our supply chains, our domestic manufacturing capacity, and the innovation around it.” 

It will also take some rethinking of how the US uses AI and other new technologies to address urgent problems. But for that to happen, the government has to take on a leading role in directing innovation to meet the public’s most pressing needs. That doesn’t sound like the government the US has now. 
