COVID data is in a rough state, making it difficult for decision-makers to understand conditions on the ground. The CDC's data management issues, due in part to unclear definitions and the myriad local and state health agencies involved, recently led one expert to call for federal data standards to clean up (or at least standardize) the mess.

From Datanami:

Analytics expert Tom Davenport co-authored an August 2020 MIT Sloan Management Review story about the shoddy state of COVID data. The authors wrote: “One is forced to conclude that the data needed to manage the COVID-19 pandemic is effectively unmanaged. This is an acute problem, demanding urgent, professional attention.”

[…]

While the CDC theoretically had plans for collecting and assessing data with state and local health authorities at a more integrated level, that apparently never came to pass, according to Davenport. "Clearly, if we're going to put the CDC in charge of fighting pandemics in the US, then I think we're going to need to have some clear data standards and processes," he said.

But there are headwinds, starting with the CDC, which historically has not been that oriented toward data, he said. "They've done a good job of fighting pandemics in other countries, Ebola and so on," he said. "But we haven't had any pandemic in the US, so I guess it's not surprising that we had a bad approach to data management for it."

What's needed, he said, is a clear set of federal data standards for relevant events, including what constitutes a case of COVID, what constitutes a death, and so forth. This would minimize the double-counting and other errors that not only give policymakers an inaccurate picture of the situation, but also undermine confidence in public health authorities. The policy could be enforced with a law requiring state and local health departments that want federal dollars to help fight a pandemic to abide by the federal standards, he said.
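
To make the idea concrete, here is a minimal, hypothetical sketch of what a shared event definition plus deduplication might look like. None of the field names or event categories come from any actual CDC standard; they are placeholders to show how a common schema lets reports from different agencies be merged without counting the same case twice.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

# Hypothetical event categories a federal standard might define.
class EventType(Enum):
    CONFIRMED_CASE = "confirmed_case"
    PROBABLE_CASE = "probable_case"
    DEATH = "death"

@dataclass(frozen=True)
class CovidEvent:
    person_id: str        # stable identifier assigned under the shared standard
    event_type: EventType
    report_date: date
    jurisdiction: str     # e.g. the state or county health department reporting it

def deduplicate(events: list[CovidEvent]) -> list[CovidEvent]:
    """Count each (person, event type) pair once, even if several
    jurisdictions report the same event."""
    seen: set[tuple[str, EventType]] = set()
    unique: list[CovidEvent] = []
    for e in events:
        key = (e.person_id, e.event_type)
        if key not in seen:
            seen.add(key)
            unique.append(e)
    return unique
```

The point of the sketch is simply that with agreed-upon identifiers and event definitions, deduplication becomes a mechanical step; without them, the same case reported by a county and a state can look like two different events.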