“Baloney Detection” in the era of “fake news”

By Kristopher A. Nelson

Much has been written recently about the explosion of so-called “fake news” — much of which, unlike real fake news such as that generated by the Onion or Andy Borowitz, falls somewhere on a spectrum from “snake-oil” pieces that generate advertising income (“clickbait”), to shrill scare pieces about supposed evildoing (e.g., claims that vaccines cause autism), to full-out political propaganda by politicians or their supporters.

In attempting to help my students (and extended family) recognize these categories more responsibly — preferably before they share them — I think it’s useful to remember Carl Sagan’s chapter on “The Fine Art of Baloney Detection” from 1996. His beginning point included the following “tools,” paraphrased below (I have bolded key elements that I think are particularly a problem in 2017):

  • Look for independent confirmation of the “facts”
  • Encourage substantive debate on the evidence by knowledgeable people
  • Arguments from authority carry little weight — but expertise is not about authority alone, and experts deserve consideration
  • Consider multiple hypotheses and do not get overly attached to any of them (i.e., be careful of confirmation bias)
  • Quantify, if possible — qualitative truths are also real, but are much more challenging (though worthwhile) to assess
  • “Occam’s Razor” — in general, go with the simpler hypothesis as it is more likely to be right
  • Ask if the hypothesis can be falsified — that is, can one seek evidence, at least in theory, to disprove something?
  • Experimentation, evidence, and data collection are key to seeking truth

Sagan also runs through the more common logical fallacies, such as:

  • Attacking the arguer and not the argument (ad hominem)
  • Arguing from authority alone
  • Arguing that, because bad things might or will happen if something is true, it therefore cannot be true
  • Claiming that what has not been proven false must be true
  • Appealing to unknowable truths (like, “God moves in mysterious ways”)
  • Assuming the answer/begging the question, which typically involves a confusion between causation and correlation
  • Selective observation, also known as “cherry picking” — just ignore facts that contradict your belief, or put yourself in a “bubble” that only supports your views
  • Using small numbers to make big claims, or otherwise torturing statistics to get what you want
  • General inconsistency of approach, like investing huge amounts to combat certain dangers (shoe bombs) while ignoring others
  • Assuming a result follows when it does not — whether as a matter of logic or because the evidence is lacking
  • Confusing correlation with causation — just because one thing comes before another does not mean it caused it (though it often suggests further research)
  • Asking meaningless questions that are only interesting rhetorically, but have no real-world existence
  • Considering only two, often extreme, positions — a false dichotomy, often deployed as part of a “straw-man” argument
  • Focusing on short-term ends without considering long-term consequences (consider global climate change)
  • Slippery slope — if we do this thing, all this other stuff will happen next (think of same-sex marriage)
  • Straw-man arguments, otherwise known as inventing an opponent and giving them absurd positions just so you can “win” an argument
  • Suppressing contrary evidence or telling only half the truth
  • Dancing around issues with rhetorical tricks and language (“think of the children!”)

There are many, many more, and all deserve more examples, particularly in today’s media climate. The whole book (The Demon-Haunted World) is worth reading.