Data is key to an effective pandemic response — and the lack of it has hobbled the U.S. response again and again. The lack of testing and, later, of standardized reporting of cases and deaths left U.S. officials slow to grasp the scale of the crisis when the virus first began to spread. Insufficient data also meant supplies arrived too late in hard-hit cities. State and federal officials made decisions about travel restrictions and reopening policies with an incomplete picture of what was happening.
Many places were forced to shut down before they had substantial outbreaks, former FDA Commissioner Scott Gottlieb told The Washington Post, and when the virus finally arrived, some resisted a return to restrictions.
“Early on, CDC couldn’t even tell us how many people were being hospitalized for covid,” Gottlieb said.
Multiple factors underlie this data deficit. First and foremost: The U.S. does not have a national health system like Israel or the U.K., and in a pandemic, must rely on a vast and decentralized public health infrastructure that is notoriously underfunded and full of holes. As a result, there is no simple way to track infections or outcomes across a wide swath of the population.
Another obstacle to data aggregation is the combination of siloed computer systems and the self-interest of medical institutions. Some hospital systems want to hang onto their data, said Michael Kurilla, director of the division of clinical innovation at the National Institutes of Health’s National Center for Advancing Translational Sciences.
“They don’t necessarily want to give up all that data because they see that as a potential future revenue stream,” Kurilla said.
The CDC compiles national statistics by collecting data from every state and locality, but these jurisdictions often have different ways of counting tests, infections and even deaths. Their data may not reach the CDC for days or weeks, and many smaller jurisdictions still submit it via outdated fax machines.