Who you gonna believe, me or your own eyes?

Mrs. Teasdale: Your Excellency, I thought you’d left!

Chicolini: Oh no, I no leave.

Mrs. Teasdale: But I saw you with my own eyes!

Chicolini: Well, who you gonna believe, me or your own eyes?

Duck Soup. Funniest movie ever.

If I could choose a super power, it would be neither flight nor invisibility, but the ability, like Triad, to separate into multiple people so I could accomplish more. I find that my multiple personality disorder is not all that efficient at getting things done. The Goth cowgirl? Lazy.

So sometimes I have to cut corners. As this post goes live I am at TAM helping with panel discussions and workshops and the only way I can get a post up is to cannibalize my lecture. Dr. Gorski will not let me post the slides and be done with it; those managing editors can be so unreasonable. Full sentences. Proper spelling. Good grammar. Sheesh. Some people.

The topic of my presentation is the cognitive errors that lead people to believe in nonsense; it is, or was, a brief tour of the flawed ways in which we think and of how the brain allows everyone to be under the false impression that fictions are real.

In the old days I simplistically thought people were just stupid, uninformed, or both. With 45% of Americans believing in faith healing, 37% in astrology, 30% in UFOs and 25% in reincarnation (if true I am sure the fates will bring me back as a rabbit in a syphilis lab), it was just that people are ignorant dumb-asses, right? Give people the facts and they will realize that they are wrong and alter their opinions accordingly. Right? Nope.

I have met the ignorant dumb-ass and he was me.

Pogo. Sort of.

Most folks are neither stupid nor dumb-asses. Critical thinking is not the default mode of the brain and most of the time, for most people, critical thinking is a waste of time. It is for me. For the activities of daily living there is little need to think critically. We rely on our experience and the experience of others to decide what to do. It is often an invaluable shortcut. I want to eat out, I check out Yelp. I want new music, I ask my kids (except the hip-hop. Tats, hip-hop and square glasses are some of the styles that confirm I am old. I don’t understand the aesthetic of any of them). I read many of the reviews on Amazon before ordering a product. And I never bought a car because it was highly rated in Consumer Reports. I get the car that elicits a frisson of want, and I have enjoyed every car I have owned.

The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

— Richard Feynman

It is fun to quote Feynman, but to day-to-day life it rarely applies.

And then you get to health care. Life and death, sickness and health. And for probably the first time the paradigm by which everyone interprets the world, experience and the advice of others, is no longer applicable or reliable for patient or doctor. It is no wonder that people trust the anecdotes and narratives of SCAM users and providers. They are using the methods most of us use to evaluate the world.

I like to say the three most dangerous words in medicine are ‘I lack insurance’. No. That’s not it. It is the words ‘In my experience.’ But experience dominates critical thinking every day in every way. If I liked to indulge in the naturalistic fallacy, I would say critical thinking is un-natural, like plastic, cement and Twinkies (I suspect all are made in the same factory from the same material), a man-made construct never found in the wild.

Feynman was a genius; most of us are not. W was closer to the human condition:

There’s an old saying in Tennessee — I know it’s in Texas, probably in Tennessee — that says, fool me once, shame on — shame on you. Fool me — you can’t get fooled again.

The lists on Wikipedia of cognitive biases, logical fallacies, and memory biases are sobering. There are so many ways to think poorly it is remarkable we get anything done. Everyone has their favorite fallacies. I like:

  • Focusing effect: the tendency to place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
  • Confirmation bias: the tendency to search for or interpret information in a way that confirms one’s preconceptions.
  • Illusory correlation: inaccurately perceiving a relationship between two events, either because of prejudice or selective processing of information.
  • Clustering illusion: the tendency to see patterns where actually none exist.

These are arguably the most important fallacies that allow people to see efficacy in nonsense. SCAMs would have difficulty existing without them.

I am terrible at recognizing logical fallacies in real time. (I always fail when they do Name That Logical Fallacy on SGU. Or is that confirmation bias?) When I do note fallacies in others, I discover that people do not take my observation that they are thinking poorly with grace and gratitude.

Go figure. People do not appreciate having their intellectual flaws identified. It is like telling them they have no sense of humor or are a lousy driver. It is one of the reasons rational opinions are ignored. No one likes a know-it-all and we all are aware of what ultimately happened to Mr. Know-It-All.

Not only do people not like having their intellectual shortcomings noticed, they probably are unable to recognize the fact that they are not excellent thinkers. So much of life is explained by the Dunning-Kruger effect:

The Dunning–Kruger effect is a cognitive bias in which unskilled people make poor decisions and reach erroneous conclusions, but their incompetence denies them the metacognitive ability to recognize their mistakes.

The unskilled therefore suffer from illusory superiority, rating their ability as above average, much higher than it actually is, while the highly skilled underrate their own abilities, suffering from illusory inferiority.

Actual competence may weaken self-confidence, as competent individuals may falsely assume that others have an equivalent understanding. As Kruger and Dunning conclude, “the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others.”

The effect is about paradoxical defects in cognitive ability, both in oneself and as one compares oneself to others.

Ever see a second-year surgical resident treat a bacteremic Staphylococcus aureus infection with clindamycin? Orally? And underdosed at that. I have. Many times. They have a culture and susceptibility; what more is needed to treat an infection? They have no clue that they do not know a burro from a burrow when it comes to treating infections. Kind of scary.

Combine Dunning-Kruger with the Peter Principle and history is explained far better than Das Kapital or The Foundation Trilogy.

And then there is memory. It is remarkable how flawed our memories are. In my ignorant youth I had thought the memory of my life was like Super 8 film, or, perhaps for you youngsters, a YouTube video: a perfect recording of events. One of my intellectual epiphanies was the book The Seven Sins of Memory by Daniel Schacter. I had no idea just how unreliable my memory is and how much of my past is a constructed narrative.

The Sins

Sin 1: Memory fades

After a month, 75% of the memory of an event fades. Except the lyrics of songs from high school. Those ARE forever. Most recollections of past events are reconstructions based on current expectations and knowledge: people remember the past not how it was but how they think it should have been.

Sin 2: Misattribution

We remember events that never happened, attribute events to people and things that were not there, or recall real events but at the wrong time and place. One of the many reasons “anecdotal evidence” of therapeutic efficacy is suspect.

Sin 3: Memory is suggestible

After a researcher planted the false memory, more than one-third of subjects recalled being hugged by Bugs Bunny at Disneyland. Impossible, because Bugs is not a Disney character.

If you don’t accurately remember whether a SCAM worked, and you think it should have, and someone tells you the SCAM worked, you will remember that it was indeed effective.

Sin 4: Memory is biased

I call it the Gigi effect. The whole thank-heaven-for-little-girls thing is kind of creepy now, but we have all had, shall we call it, a discussion with our significant other after a social event, where we each remember the disputed events in ways that make us look noble and the other suspect. And if you think a SCAM is the next best thing to champagne, then you will remember the intervention fondly.

Sin 5: Memory has persistence

Especially when associated with stressors. Medical training has left me with TNTC (too numerous to count) memories associated with a wee bit of PTSD. I still have not-ready-for-the-test dreams. Intense experiences imprint memories and give them disproportionate importance later. I can be far more biased by failures and complications than by any good outcome.

Health issues are major stressors, so the flawed memories are going to have a disproportionate impact.

To be complete, sins 6 and 7 are absentmindedness and blocking. They are not germane to this discussion, but I know that if I did not include them there would be HE-double-hockey-sticks to pay. Our readers are a fastidious lot.

It all comes together in the archetype, at least for research, of N-rays. I love the story of N-rays because it is in the hardest of the hard sciences, physics. Blondlot was a French physicist who saw changes in the brightness of an electric spark that he attributed to a new form of radiation, which he named the N-ray. Some 120 other scientists published over 300 articles claiming to be able to detect N-rays emanating from most substances.

Most researchers at the time used the perceived light of a dim phosphorescent surface as “detectors”, although work in the period clearly showed the change in brightness to be a physiological phenomenon rather than some actual change in the level of illumination (Wikipedia).

It was suspected at the time that Blondlot was seeing things that were not actually there, since the observations made no sense given the understanding of the day. A killjoy physicist, Robert Wood, visited the lab and surreptitiously sabotaged the N-ray machine, yet Blondlot et al. still saw the N-rays.

The modern equivalent is people who have adverse effects from cell towers even when the towers are off or who have salubrious effects from magnets even when there is no magnetism. The ability to experience what is not there is astounding.

The last thing that I have learned is that for many people the facts just do not matter. People will hang on to their beliefs no matter what the evidence: derp.

English has no word for “the constant, repetitive reiteration of strong priors”. Yet it is a well-known phenomenon in the world of punditry, debate, and public affairs. On Twitter, we call it “derp”.

Which is to say, a policy commentator is “derpy” when his or her (usually his) prior assumptions about the world are so unwarrantedly strong that he is unswayable by evidence. Derpers have a faith-based approach to policy.

When I asked my youngest son about derp, he groaned and said something to the effect that my mentioning the word was the definition of cringe-worthy and to never do it again. Just a warning for those of you with teenagers at home.

My favorite one-star review of my Quackcast on iTunes was:

Harm to the brian (sic).

Didn’t need to listen.

No facts for him. Derp.

And Brian? So Sorry.

So when someone mentions errant information in support of a particular SCAM, your response with some reality-based fact will likely go nowhere. I suspect the key word is ‘actually’. Start a sentence with ‘actually’ and you might as well stop there. They do not want to hear what you have to say and will not consider it valid even if they hear it. If ‘actually’ were removed from the skeptical lexicon, we would never get a sentence started in a discussion with a woo believer.

Human nature predisposes us to believe in SCAMs. You can’t change human nature, but you can be aware of its flaws and compensate.

It is, actually, what makes a skeptic and a critical thinker.

BTW: Some of this post was cut and pasted from old Keynote presentations without references. I do not think I am plagiarizing other authors, but I am not certain all the words are indeed mine; Google searching suggests they are.
