Humanist. Agnostic. Atheist. I always stumble a bit when asked how I view the world, generally preferring to appear matter-of-fact rather than antagonistic. Perhaps the best moniker to capture my desired view of the world is “skeptic.”
What makes a skeptic? An outlook on the world based on the scientific method: the accumulation of empirical evidence and the continuous testing, retesting, rejection, and modification of hypotheses in light of that evidence. Scientists are human, too, and may be led astray by their own biases and quirks, but science is a long-term progressive process aimed at pruning away these false starts and tangents and leaving a coherent, predictive interpretation on matters of fact.
More skepticism would make our society better, I believe. As readers of this blog know, one of my major irritations is the sway that pseudoscience continues to hold. I am a staunch believer in freedom of conscience, so I don’t much care, for example, if people choose to be theists. What I mind is the use of superstition as a substitute for facts and rationality in areas where it matters, like public policy. You want to believe God created the universe in seven days? Fine by me. You want to pass that off as science in the schools? Unacceptable.
It was with delight, then, that I just finished reading Michael Shermer’s Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time. Shermer is the director of the Skeptics Society, a group dedicated precisely to debunking anti- and pseudo-scientific thinking, particularly in the public sphere (see this TED talk, for example). His book is a good analysis of the distinctions between science and pseudoscience and a brief overview of some of the major battle lines on this front of the culture wars: the paranormal, near-death experiences, alien encounters, witch crazes, cults, creationism, and Holocaust denial.
In Chapter 3, Shermer talks about “how thinking goes wrong” for both scientists and lay folk alike, and how legitimate science attempts to self-correct in the face of these errors. Here’s his list, with my own annotations:
- Theory influences observations
- The observer changes the observed
- Equipment constructs results
- Anecdotes do not make a science. And yet so many people cite them as the basis for their beliefs.
- Scientific language does not make a science. This is a major pet peeve of mine: people invoking the uncertainty principle and wave-particle duality to justify all sorts of mumbo-jumbo.
- Bold statements do not make claims true
- Heresy does not equal correctness
- Burden of proof [is on the outsider seeking to overturn the accepted and proven scientific paradigm]
- Rumors do not equal reality
- Unexplained is not inexplicable. This is what really gets me about so-called Intelligent Design: I can’t think of how this complex structure could possibly have evolved, therefore it couldn’t have evolved, therefore there’s a designer.
- Failures are rationalized. Failures advance science, but tend to be ignored in pseudoscience.
- After-the-fact reasoning. *Post hoc, ergo propter hoc.*
- Coincidence. “Synchronicity” my ass. It’s simple probability.
- Representativeness. The human tendency to remember hits and ignore misses keeps psychic hotlines in business.
- Emotive words and false analogies
- Ad ignorantiam. “If you can’t disprove it, it is proven” — the complete opposite of science. Applies to God, psychics, etc.
- Ad hominem and tu quoque
- Hasty generalization
- Overreliance on authorities
- Either-or. “If your theory is wrong, then mine must be right.” Is it supported by facts? What about alternatives?
- Circular reasoning
- Reductio ad absurdum and slippery slope
- Effort inadequacies and the need for certainty, control, and simplicity. Critical thinking requires training and work.
- Problem-solving inadequacies. We tend to seek supporting rather than contrary evidence for our views.
- Ideological immunity, or the Planck problem. As the old school dies out, new practitioners are better able to evaluate and embrace what were once revolutionary ideas.
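The coincidence point deserves a number to back it up. A classic illustration (my own, not from Shermer’s book) is the birthday problem: among just 23 people, the odds that two share a birthday already exceed 50%, which most people find shockingly high. Apparent “synchronicities” are common for the same reason: given enough opportunities for a match, matches happen. A quick sketch of the calculation:

```python
# Birthday problem: probability that at least two of n people share a
# birthday, assuming 365 equally likely birthdays (no leap days).
# Illustrates the "coincidence" bullet above: seemingly unlikely matches
# become likely once there are many chances for one to occur.

def shared_birthday_probability(n, days=365):
    """Return P(at least two of n people share a birthday)."""
    p_no_match = 1.0
    for i in range(n):
        # Each successive person must avoid all birthdays taken so far.
        p_no_match *= (days - i) / days
    return 1.0 - p_no_match

if __name__ == "__main__":
    for n in (10, 23, 50):
        p = shared_birthday_probability(n)
        print(f"{n:2d} people: {p:.1%} chance of a shared birthday")
```

With 23 people the probability is about 50.7%, and with 50 it is about 97% — simple probability, no synchronicity required.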
[Shermer also has a chapter at the end on “Why smart people believe weird things,” where he cites intellectual attribution bias (“my choices are rational, but your choices are swayed by irrational beliefs”) and confirmation bias (“I block out contrary evidence.”)]
I highly recommend this book as a skeptic’s manifesto and a reminder that we are not alone in fighting and bemoaning the ignorance around us. I will leave you with an inspirational quote from chapter 15 attributed to Alfred Kinsey (of sex research fame). This is a quote that I found a bit tangential to the points in the book, but which I like for the (perhaps unjustified) hope that with more skepticism perhaps we can make society more supportive of human dignity and differences:
Prescriptions are merely public confessions of prescriptionists. What is right for one individual may be wrong for the next; and what is sin and abomination to one may be a worthwhile part of the next individual’s life. The range of individual variation, in any particular case, is usually much greater than what is generally understood…. And yet social forms and moral codes are prescribed as though all individuals were identical; and we pass judgements, make awards, and heap penalties without regard to the diverse difficulties involved when such different people face uniform demands.