How can the number of deaths go down?
Why do case and death counts sometimes change so drastically?
Earlier today, the World Health Organization reported 30 deaths associated with the unidentified outbreak in the Democratic Republic of Congo. Just yesterday, the Africa Centres for Disease Control and Prevention (Africa CDC) put the number of deaths at 79. On Tuesday, Reuters estimated it was 143. How can the number of deaths go down?
While I don't have insight into this specific outbreak, large fluctuations in counts are common in the early days of outbreaks. Epidemiological investigations and case definitions often explain these shifts.
A case definition specifies the precise symptoms, epidemiological history (e.g., contact with a certain animal) or test results required for an illness or death to be included in the outbreak tally. As the investigation unfolds, the criteria for inclusion may change, new information may be used to refine the case definition, or previously reported cases might be reclassified or removed from the tally.
Consider this thought experiment. Suppose you are an epidemiologist charged with investigating reports of a febrile (fever) illness in a classroom. You begin by asking the 30 students whether they have recently had a cough. Half raise their hands.
After considering this for a moment, you get more specific. "In the last 7 days," you say, "how many of you had a fever of at least 101 degrees AND either a cough or a sore throat?" This is your case definition. Now the number of hands drops to five.
You recommend diagnostic testing for those five students. Three are found to have influenza A, one has adenovirus, and another has rhinovirus. Those last two are no longer part of the outbreak count, because they have a different diagnosis. Thanks to the case definition, your number of confirmed cases is 3, down from the 15 who first raised their hands. And you have identified the cause behind the outbreak: influenza.
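The classroom logic above can be sketched in code. This is a purely illustrative toy, not real surveillance software: the student records, field names, and `meets_case_definition` helper are all invented for the example. The case definition acts as a filter over reports, and lab results then reclassify the suspected cases.

```python
# Toy illustration of the classroom thought experiment.
# All records below are invented for the example.

students = [
    # The five students who meet the full case definition, with lab results.
    {"id": 1, "fever": True, "cough_or_sore_throat": True, "lab": "influenza A"},
    {"id": 2, "fever": True, "cough_or_sore_throat": True, "lab": "influenza A"},
    {"id": 3, "fever": True, "cough_or_sore_throat": True, "lab": "influenza A"},
    {"id": 4, "fever": True, "cough_or_sore_throat": True, "lab": "adenovirus"},
    {"id": 5, "fever": True, "cough_or_sore_throat": True, "lab": "rhinovirus"},
] + [
    # The other ten students who raised their hands for a cough alone.
    {"id": i, "fever": False, "cough_or_sore_throat": True, "lab": None}
    for i in range(6, 16)
]

def meets_case_definition(s):
    """Fever of at least 101F AND a cough or sore throat in the last 7 days."""
    return s["fever"] and s["cough_or_sore_throat"]

suspected = [s for s in students if meets_case_definition(s)]
confirmed = [s for s in suspected if s["lab"] == "influenza A"]

print(len(students))   # 15 students reported a cough
print(len(suspected))  # 5 meet the case definition
print(len(confirmed))  # 3 are laboratory-confirmed influenza A
```

The count "drops" from 15 to 5 to 3 not because anyone recovered, but because each step applies a stricter criterion to the same underlying reports.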
Outbreak counts can also balloon in misleading ways. Suppose you are now called to the hospital to investigate an unusual cluster of five patients with acute respiratory distress. The patients are young and otherwise healthy, and the attending clinician is worried that something scary is happening. Testing reveals that all five patients have SARS, a severe coronavirus disease with pandemic potential. This is terrible news.
Your public health team races to investigate. You begin by searching for more cases, this time among the healthcare workers and family members who had contact with the hospitalized patients. By the next day, you have found 18 more people who meet the case definition. Thus, the case count has jumped from five to 23, not because the outbreak has truly more than quadrupled in size from one day to the next, but because your investigation has turned up more cases.
There is (at least) one other way that mismatched numbers can complicate outbreak reporting. Some case definitions have multiple levels of certainty, depending on how sure the investigator is that the case is "real." A presumed case may have the relevant symptoms and history but lack test results, while a confirmed case is laboratory-confirmed. News reports may differ in what they are reporting, with some citing presumed and confirmed cases combined, while others report only confirmed cases.
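To see how two outlets can honestly publish different totals from the same data, consider this hypothetical sketch. The line list and the split between presumed and confirmed cases below are invented for illustration, not taken from any real outbreak report.

```python
# Hypothetical line list: the same outbreak, tallied two different ways.
# The numbers are invented for illustration.

line_list = ["confirmed"] * 12 + ["presumed"] * 19

confirmed_only = sum(1 for case in line_list if case == "confirmed")
presumed_plus_confirmed = len(line_list)

print(confirmed_only)           # 12: what an outlet citing only confirmed cases reports
print(presumed_plus_confirmed)  # 31: what an outlet combining both levels reports
```

Neither figure is wrong; they simply answer different questions, which is why comparing counts across sources requires knowing which certainty levels each source includes.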
Thus in outbreak reporting, fluctuating numbers often reflect evolving investigations, refined case definitions, or varying reporting standards, not necessarily suspicious inconsistencies or errors. And when accounting hiccups do occur, epidemiologists typically correct the record and document how and why errors occurred. This process ensures that the final data tells an accurate story, even if the investigation has twists and turns along the way.
If you enjoyed this newsletter, you will likely enjoy my new book Crisis Averted: The Hidden Science of Fighting Outbreaks.
Great article as usual Dr Caitlin. It is so easy to get conspiracy theories started when people do not understand the measurement system and the associated criteria for counting!! Thank you!
In my mind, this is why I always ask about the data sources. There's a difference between facts and just figures. I want to be sure I'm not comparing apples to oranges, and many reports don't give enough detail to know that...