Why People Believe Weird Things

Humanist. Agnostic. Atheist. I always stumble a bit when asked how I view the world, generally preferring to appear matter-of-fact rather than antagonistic. Perhaps the best moniker to capture my desired view of the world is “skeptic.”

What makes a skeptic? An outlook on the world based on the scientific method: the accumulation of empirical evidence and the continuous testing, retesting, rejection, and modification of hypotheses in light of that evidence. Scientists are human, too, and may be led astray by their own biases and quirks, but science is a long-term progressive process aimed at pruning away these false starts and tangents and leaving a coherent, predictive interpretation of matters of fact.

More skepticism would make our society better, I believe. As readers of this blog know, one of my major irritations is the sway that pseudoscience continues to hold. I am a staunch believer in freedom of conscience, so I don’t much care, for example, if people choose to be theists. What I mind is the use of superstition as a substitute for facts and rationality in areas where it matters, like public policy. You want to believe God created the universe in seven days? Fine by me. You want to pass that off as science in the schools? Unacceptable.

It was with delight, then, that I just finished reading Michael Shermer’s Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time. Shermer is the director of the Skeptics Society, a group dedicated precisely to debunking anti- and pseudo-scientific thinking, particularly in the public sphere (see this TED talk, for example). His book is a good analysis of the distinctions between science and pseudoscience and a brief overview of some of the major battle lines on this front of the culture wars: the paranormal, near-death experiences, alien encounters, witch crazes, cults, creationism, and Holocaust denial.

In Chapter 3, Shermer talks about “how thinking goes wrong” for both scientists and lay folk alike, and how legitimate science attempts to self-correct in the face of these errors. Here’s his list, with my own annotations:

  1. Theory influences observations
  2. The observer changes the observed
  3. Equipment constructs results
  4. Anecdotes do not make a science. And yet so many people cite them as the basis for their beliefs.
  5. Scientific language does not make a science. This is a major pet peeve of mine: people invoking the uncertainty principle and wave-particle duality to justify all sorts of mumbo-jumbo.
  6. Bold statements do not make claims true
  7. Heresy does not equal correctness
  8. Burden of proof [is on the outsider seeking to overturn the accepted and proven scientific paradigm]
  9. Rumors do not equal reality
  10. Unexplained is not inexplicable. This is what really gets me about so-called Intelligent Design: I can’t think of how this complex structure could possibly have evolved, therefore it couldn’t have evolved, therefore there’s a designer.
  11. Failures are rationalized. Failures advance science, but tend to be ignored in pseudoscience.
  12. After-the-fact reasoning. *Post hoc, ergo propter hoc.*
  13. Coincidence. “Synchronicity” my ass. It’s simple probability (see the sketch after this list).
  14. Representativeness. The human tendency to remember hits and ignore misses keeps psychic hotlines in business.
  15. Emotive words and false analogies
  16. Ad ignorantiam. “If you can’t disprove it, it is proven”— the complete opposite of science. Applies to God, psychics, etc.
  17. Ad hominem and tu quoque
  18. Hasty generalization
  19. Overreliance on authorities
  20. Either-or. “If your theory is wrong, then mine must be right.” Is it supported by facts? What about alternatives?
  21. Circular reasoning
  22. Reductio ad absurdum and slippery slope
  23. Effort inadequacies and the need for certainty, control, and simplicity. Critical thinking requires training and work.
  24. Problem-solving inadequacies. We tend to seek supporting rather than contrary evidence for our views.
  25. Ideological immunity, or the Planck problem. As the old school dies out, new practitioners are better able to evaluate and embrace what were once revolutionary ideas.
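
As a quick illustration of item 13, here is a back-of-the-envelope sketch (mine, not Shermer’s) of the classic birthday problem in Python: once there are enough opportunities for a match, “amazing” coincidences become the expected outcome rather than evidence of synchronicity.

```python
# Back-of-the-envelope sketch (not from the book): the birthday problem.
# P(at least one shared birthday among n people)
#   = 1 - (365/365) * (364/365) * ... * ((365 - n + 1)/365)

def shared_birthday_probability(n: int) -> float:
    """Probability that at least two of n people share a birthday."""
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

for n in (10, 23, 50):
    print(f"{n} people: {shared_birthday_probability(n):.0%} chance of a shared birthday")
# Roughly: 10 people -> 12%, 23 people -> 51%, 50 people -> 97%
```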

[Shermer also has a chapter at the end on “Why smart people believe weird things,” where he cites intellectual attribution bias (“my choices are rational, but your choices are swayed by irrational beliefs”) and confirmation bias (“I block out contrary evidence”).]

I highly recommend this book as a skeptic’s manifesto and a reminder that we are not alone in fighting and bemoaning the ignorance around us. I will leave you with an inspirational quote from chapter 15 attributed to Alfred Kinsey (of sex research fame). The quote is a bit tangential to the points in the book, but I like it for the (perhaps unjustified) hope that with more skepticism we can make society more supportive of human dignity and differences:

Prescriptions are merely public confessions of prescriptionists. What is right for one individual may be wrong for the next; and what is sin and abomination to one may be a worthwhile part of the next individual’s life. The range of individual variation, in any particular case, is usually much greater than what is generally understood…. And yet social forms and moral codes are prescribed as though all individuals were identical; and we pass judgements, make awards, and heap penalties without regard to the diverse difficulties involved when such different people face uniform demands.