Opinion

Covid-19 data is a public good. The US government must start treating it like one.

The US has failed to prioritize a highly effective and economical intervention—providing quick and easy access to coronavirus data.

Earlier this week, as a pandemic raged across the United States, residents were cut off from the only publicly available source of aggregated data on the nation’s intensive care and hospital bed capacity. When the Trump administration stripped the Centers for Disease Control and Prevention (CDC) of control over coronavirus data, it also took that information away from the public.

I run a nonpartisan project called covidexitstrategy.org, which tracks how well states are fighting this virus. Our team is made up of public health and crisis experts with previous experience in the Trump and Obama administrations. We grade states on such critical measures as disease spread, hospital load, and the robustness of their testing. 

Why does this work matter? In a crisis, data informs good decision-making. Along with businesses, federal, state, and local public health officials and other agencies rely on us to help them decide which interventions to deploy and when workplaces and public spaces can safely reopen. Almost a million people have used our dashboards, with thousands coming back more than 200 times each.

To create our dashboards, we rely on multiple sources. One is the National Healthcare Safety Network (NHSN), run by the CDC. Prior to July 14, hospitals reported the utilization and availability of intensive care and inpatient beds to the NHSN. This information, updated three times a week, was the only publicly available source of aggregated state-level hospital capacity data in the US.

With 31 states currently reporting increases in the number of hospitalized covid-19 patients, these utilization rates show how well their health systems will handle the surge of cases.

Having this information in real time is essential. The administration said the CDC’s system was insufficiently responsive and that data collection needed to be streamlined, so the US Department of Health and Human Services (HHS) directed hospitals to report their data to a new system called HHS Protect.

Unfortunately, by redirecting hospitals to a new system, the administration left everyone else in the dark. On July 14, the CDC removed the most recent data from its website. As we made our nightly update, we found the data was missing. After significant public pressure, the existing maps and data are back—but the agency has added a disclaimer that the data will not be updated going forward.

This is unacceptable. This critical indicator was being shared multiple times a week, and now updates have been halted. US residents need a federal commitment that this data will continue to be refreshed and shared.

The public is being told that a lot of effort is going into the new system. An HHS spokesman told CNBC that the new database will deliver “more powerful insights” on the coronavirus. But the switch has rightly been criticized because this new data source is not yet available to the public. Our concerns are amplified by the fact that responsibility for the data has shifted from a known entity in the CDC to a new, as-yet-unnamed team within HHS.

I was part of the team that helped fix Healthcare.gov after the failed launch in 2013. One thing I learned was that the people who make their careers in the federal government—and especially those working at the center of a crisis—are almost universally well intentioned. They seek to do the right thing for the public they serve.

In the same spirit, and to build trust with the American people, this is an opportunity for HHS to make the same data it’s sharing with federal and state agencies available to the public. The system that HHS is using helps inform the vital work of the White House Coronavirus Task Force. From leaked documents, we know that reports for the task force are painstakingly detailed. They include county-level maps, indicators on testing robustness, and specific recommendations. All of this information belongs in the public domain.

This is also an opportunity for HHS to make this data machine readable and thereby more accessible to data scientists and data journalists. The Open Government Data Act, signed into law by President Trump, treats data as a strategic asset and makes it open by default. This act builds upon the Open Data Executive Order, which recognized that the data sets collected by the government are paid for by taxpayers and must be made available to them. 

As a country, the United States has lagged behind in so many dimensions of response to this crisis, from the availability of PPE to testing to statewide mask orders. Its treatment of data has lagged as well. On March 7, as this crisis was unfolding, there was no national testing data. Alexis Madrigal, Jeff Hammerbacher, and a group of volunteers started the COVID Tracking Project to aggregate coronavirus information from all 50 state websites into a single Google spreadsheet. For two months, until the CDC began to share data through its own dashboard, this volunteer project was the sole national public source of information on cases and testing. 

With more than 150 volunteers contributing to the effort, the COVID Tracking Project sets the bar for how to treat data as an asset. I serve on the advisory board and am awed by what this group has accomplished. With daily updates, an API, and multiple download formats, they’ve made their data extraordinarily useful. Where the CDC’s data is cited 30 times in Google Scholar and approximately 10,000 times in Google search results, the COVID Tracking Project data is cited 299 times in Google Scholar and roughly 2 million times in Google search results.
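To make that concrete, here is a minimal sketch of what machine-readable access looks like in practice: a few lines of Python that pull the project’s state-level figures over its public API. The endpoint and field names shown are illustrative rather than authoritative; the project’s own documentation is the source of record.

```python
# A minimal sketch (not an official client): pull current state-level figures
# from the COVID Tracking Project's public API. The endpoint and field names
# below are illustrative; consult the project's API documentation for the
# documented routes and schema.
import json
import urllib.request

URL = "https://api.covidtracking.com/v1/states/current.json"  # illustrative endpoint

with urllib.request.urlopen(URL) as resp:
    states = json.load(resp)  # the API returns a JSON array of per-state records

# Print a simple per-state summary: total test results and current hospitalizations.
for row in states:
    print(row.get("state"), row.get("totalTestResults"), row.get("hospitalizedCurrently"))
```

That kind of low-friction, programmatic access is exactly what HHS could offer for the data now flowing into HHS Protect.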

Sharing reliable data is one of the most economical and effective interventions the United States has to confront this pandemic. With the Coronavirus Task Force daily briefings a thing of the past, it’s more necessary than ever for all covid-related data to be shared with the public. The effort required to defeat the pandemic is not just a federal response. It is a federal, state, local, and community response. Everyone needs to work from the same trusted source of facts about the situation on the ground. Data is not a partisan affair or a bureaucratic preserve. It is a public trust—and a public resource.

Ryan Panchadsaram is a cofounder of covidexitstrategy.org and United States Digital Response. He currently works at Kleiner Perkins and was formerly the deputy chief technology officer for the United States.
