The Trouble With 'No Evidence'

Monday, December 20, 2021

A recent post by Scott Alexander at Astral Codex Ten makes a much-needed point about scientific communication that seems head-slappingly obvious ... but only after he makes it, having first walked us through a bunch of real-world examples:

Doesn't that rube know there's 'no evidence' that parachutes prevent deaths? (Image by Ernesto Velázquez, via Unsplash, license.)
... Science communicators are using the same term -- "no evidence" -- to mean:
  1. This thing is super plausible, and honestly very likely true, but we haven't checked yet, so we can't be sure.
  2. We have hard-and-fast evidence that this is false, stop repeating this easily debunked lie.
This is utterly corrosive to anybody trusting science journalism.

Imagine you are John Q. Public. You read "no evidence of human-to-human transmission of coronavirus", and then a month later it turns out such transmission is common. You read "no evidence linking COVID to indoor dining", and a month later your governor has to shut down indoor dining because of all the COVID it causes. You read "no hard evidence new COVID strain is more transmissible", and a month later everything is in panic mode because it was more transmissible after all. And then you read "no evidence that 45,000 people died of vaccine-related complications". Doesn't sound very reassuring, does it?
One common example from the pandemic -- which he briefly takes up -- is face masks, for which there isn't (as far as I know) conclusive scientific evidence one way or the other regarding efficacy against Covid transmission, but for which there is a strong common-sense case. And yes, Alexander does mention that it does not serve the cause of clarity to make a big deal out of that lack of a particular kind of evidence.

The whole post is well worth a read, and includes my favorite example of the silliness of demanding that everything be proven through a scientifically rigorous study: the claim that there is "no evidence" that using a parachute helps prevent injuries and deaths when jumping out of planes.

-- CAV

2 comments:

SteveD said...

A lot depends upon the level of effort that has been made to procure evidence. If no effort has been made, then it stands to reason there will likely be no (or little) direct evidence. If we've been studying the phenomenon for years and still have no evidence, then we may need to face the fact that we are wrong or that some other factor is in play.
There are also cases where it might not be possible (or ethically permissible) to procure direct hard evidence. I would assume this is the case for parachutes. We must fall back on making a common-sense or logical argument.
There are also levels of certainty. For example, with masks, how would you do a double-blind study (RCT)? In the recent study from Bangladesh, it appears that people wearing masks were also more likely to social distance (a confounding factor). Was the decrease in transmission due to masks or social distancing or both (does it even matter)? Which direction does the cause-and-effect go? In situations like this, we fall back to making our case indirectly, and our conclusions will have lower certainty. This explains, for example, why chemistry is a 'harder' science than history. With history it is much harder to remove confounding factors than it is in a chemistry lab.

Gus Van Horn said...

Steve,

Your comment regarding confounding factors for mask studies reminds me of another problem, which plagues the field of nutrition: self-reporting is notoriously inaccurate for a whole host of reasons, making such evidence very suspect. (I don't know if self-reporting was involved in the study you bring up; you just jogged my memory.)

And all this affects what we might or might not have evidence about -- and how good any of it is -- on top of the difficulties caused by using that phrase with a public that is largely unaware of these problems or of the distinction Alexander highlights.

Gus