The Unpersuadables

We would like to believe people are rational. We would like to believe that if they have formed a false belief based on inaccurate information and poor reasoning, they will change that belief when they are provided with accurate information and better reasoning. We are frequently disappointed.

An example of what should happen

I recently talked with a college professor who believed chiropractic treatment could lower blood pressure. His belief was based on a media report of a chiropractic study. He thought it was plausible that neck manipulation could somehow relieve obstructions to blood flow to the base of the brain and thereby correct the cause of high blood pressure. I told him that rationale was anatomically and physiologically implausible. I pointed out that the researchers used NUCCA, a form of manipulation that is rejected by most chiropractors. He did not know what NUCCA was. I provided him with information, including links to the study itself and to chiropractor Sam Homola’s excellent critique of the study. My friend changed his mind and thanked me for educating him.

An example of what all too often happens

I was invited to give the “con” side of a pro/con presentation on dowsing to a local discussion group. I lent my opponent my copy of Vogt and Hyman’s classic book Water Witching USA so he would know ahead of time what I was going to say. He read it. The book explains how the ideomotor effect creates the illusion that the dowsing rod moves of its own accord and explains that dowsers have never been able to pass controlled scientific tests. I said as much in my “con” presentation. His “pro” presentation consisted of two arguments: he had personally seen dowsing work, and lots of people believed in it. He didn’t even try to rebut my facts and arguments; he simply refused to engage with them in any way. It was as if he had not read the book and had not heard anything I said. Afterwards, one member of the audience was heard to say she would have liked to hear more about how dowsing worked and less about how it didn’t work!

Will Storr investigates

Sadly, some people are unpersuadable. They might as well be saying “My mind’s made up; don’t confuse me with the facts.” We have seen plenty of glaring examples in the comments section of this blog. Will Storr wrote a book, The Unpersuadables: Adventures with the Enemies of Science, about his struggle to understand the phenomenon. He did a great job of investigative reporting, interviewing people with strange beliefs, spending time with them and also with their critics, and reading pertinent research.

Storr found that discussing evolution with religious fundamentalists was:

…like being a tourist in another Universe… Simple facts and basic logic just don’t work the way I had assumed… facts proved entirely ineffective, and they were ineffective to a spectacular and baffling extent.

The answers they gave to his reasonable questions were often hilarious. If T. rex was a vegetarian, why did he have such big teeth? To eat watermelons.

He found that he liked the people he interviewed. They were not stupid. What made them so unpersuadable? He quickly realized that the answer was not intelligence or education or logic. Something else was going on, but what was it?

He found some clues in recent brain research.

How the brain deceives us

“I know what I saw.” No you don’t; your brain constructed an illusion. We have all seen amazing demonstrations of optical illusions, but we don’t realize we are seeing them constantly. Our brains deceive us. We don’t get the raw data from the retinal receptors; we only get the brain’s interpretation of the data. We can’t even see that we have a blind spot in each visual field corresponding to the place where the optic nerve enters the retina. The brain puts elements of vision together (horizontal lines, contrast, etc.) and constructs a best guess based on what it has learned about the world. We ignore most of our sensory input, becoming aware of things only on a need-to-know basis. It has been estimated that up to 90% of what you are seeing is constructed from your memories.

As babies grow, they interact with the world and their brains develop internal models, ways of interpreting sensory inputs that involve shortcuts and illusions (example: if it looks smaller, it’s probably farther away). The brain builds models of what the world is like, and the senses are then used as fact-checkers, noticing anomalies that don’t match the model. If we receive information that fits well with our internal models, it is readily incorporated; if it differs, we “minimize, distort, rationalize and even hallucinate our way into disregarding this information…we lie to ourselves.”

We would like to believe that we are in conscious control of our actions and thoughts, but we are not. We have no idea why we believe what we believe. Storr illustrates this with a study where people were asked to choose the best candidate for police chief. When told the male candidate was formally educated and the woman was streetwise, they said they thought carefully and chose the man because education was more important for a police chief. When told the female candidate was formally educated and the man was streetwise, they said they thought carefully and chose the man because it was most useful for a police chief to be streetwise. In both cases they chose the man, then constructed a reason to justify the choice.

We have no idea why we do what we do. Split-brain studies and decision-timing studies show that we lack access to the unconscious processes that determine our actions and decisions, and we try to make up plausible reasons after the fact.

We can’t even trust our memories. They are reconstructed every time we access them, and they can become distorted or contaminated with other memories. Psychological studies suggest that about 30% of our memories are false, including some of the ones we are most confident about.

The brain tries to protect us by sustaining a positive self-image. When I do wrong, it was because outside circumstances conspired so I had no other option; when you do wrong it’s because of your internal character flaws. The perpetrators of evil are able to rationalize it so they can believe they are doing something good.

Errors in thinking

We are hampered by prehistoric thinking equipment. We are suckers for a well-told story; an attractive plot can seduce us into a false belief. We are tribal animals who think in binary terms. We divide people into Us vs. Them, and cast Them as villains.

Cognitive dissonance is painful; confirmation bias is comforting. Experience is re-interpreted in such a way that it doesn’t force us to rebuild our internal models of reality. We are all prejudiced, but we need prejudices to function efficiently. They serve as a practical starting point for our guesses about the world.

We are subject to confirmation bias; we get a feel-good “neurochemical kiss” as a reward for confirming a brain model. Confirmation bias serves a purpose. If we had to fairly evaluate every new argument and every bit of new evidence from scratch and constantly rebuild our models, we would become hopelessly overwhelmed and unable to function.

We are adept at ignoring information that contradicts our beliefs. Storr interviewed a woman who told him her auditory hallucinations were caused by sexual abuse she experienced at age 11-12. She then admitted she first started hearing voices at age 9. She was untroubled by the discrepancy. A Morgellons patient reported that his symptoms resolved when he simply stopped scratching and picking at his skin. But he still believed he had a mysterious disease that caused strange fibers to emerge from his skin.

High IQ offers no protection. A study asked people to think of as many reasons as possible for and against a belief; the higher IQ people thought of more reasons for their own beliefs, but were no better at thinking of reasons against their beliefs.

Storr visits a Nazi gas chamber with Holocaust denier David Irving. Irving notices that there are handles on the inside of the doors and says it is evidence that people were not locked in and could get out any time they wanted. He does not see that the handles are unconnected to any opening mechanism and that there are bolts on the outside of the doors that would have locked the doors closed over airtight seals. What happened in his mind when he saw the bolts? Liar or deluded; evil or mistaken? Can we even hope to understand?

Probably not. Thinking is driven by emotion. Storr concludes that when Irving thinks of empire, he feels joy; when he thinks of Jews, he feels “something else.” All the rest is confabulation.

We are storytellers

Humans are storytelling animals. Our lives depend on stories. They help us make sense of the world, explaining cause and effect and helping us remember. Emotional narrative leads the news; stories that fit well with our feelings are compelling. Unfortunately, stories tend to work against truth; their purpose is not fact but propaganda. A single anecdotal experience like taking a homeopathic remedy and feeling better is interpreted by the brain and turned into an invented story about cause and effect. Science is a new kind of language that fights against the dominion of the narrative. “We can hardly be surprised if some feel an instinctive hostility towards [science], for it is fundamentally inhuman.”

Consciousness is the first storyteller. It tells us that we have free will, provides explanations for our actions, and presents self as hero. Religions and ideologies play into the hero plot since they match up well with the individual’s moral hunches and provide external justification. They validate emotional instincts, provide purpose and a common enemy. They can be useful but can also be dangerous; people have died for false beliefs.

The hero-maker is complemented by a demon-maker. Storr says that Steven Novella sees many practicing homeopaths as psychopathic con artists, while homeopathy proponent Dana Ullman sees skeptics as Big Pharma shills.

Some people accept a belief only if it can be shown to correspond to reality; others accept beliefs just because they are part of a coherent system.

If a person’s set of beliefs all cohere, it means that they are telling themselves a highly successful story. It means that their confabulation is so rich and deep and all-enveloping that almost every living particle of nuance and doubt has been suffocated. Which says to me, their brains are working brilliantly, and their confabulated tale is not to be trusted.

Storr tries to sort out the different stories told by opponents: Richard Wiseman vs. Rupert Sheldrake on Sheldrake’s dog ESP experiments, James Randi vs. those who have accused him of lying. He interviews the individuals on both sides and does his best to find “the truth,” but all he finds is two competing stories, and he says “stories are never true.”

On skeptics

We are attracted to others whose models match our own, and in our heroic, altruistic, wonderfully human way, we go out and try to persuade others to change their models to match ours. This is certainly true of religious proselytizers, and Storr thinks it is equally true of skeptics. He makes some harsh observations about the skeptical movement, some more justified than others.

He attends skeptical conferences to observe us. At a QED conference in England, he talks to a skeptic who says homeopathy is bad but admits he hasn’t read any homeopathy studies. He thinks skeptics tend to place an unquestioning reliance on their leaders, just like the followers of any cult. At The Amazing Meeting (TAM) he overhears a non-attendee saying skeptics are like conspiracy theorists.

Storr gets some things wrong

He asks James Randi if he has ever changed his position on anything based on the evidence; Randi can’t think of anything offhand. Storr condemns Randi for not being willing to change his mind, but the fact that Randi hasn’t changed his mind doesn’t mean he is unwilling to. Storr doesn’t offer any examples of subjects about which Randi should have changed his position and didn’t. Certainly no better evidence has surfaced to make skeptics change their positions on things like homeopathy, dowsing, Nostradamus, or psychic powers. I would argue that Randi’s positions are not likely to be proven wrong by the evidence, because he based them on evidence and plausibility in the first place.

He thinks skeptics are too hard on Morgellons patients. He thinks we fall for binary thinking when we assume they all have delusions of parasitosis (DOP). He suggests that maybe some of them have an underlying undiagnosed sensory disorder that causes pruritus. I would argue that we are open to that possibility and would accept evidence that supported it, and I don’t think we are automatically labeling them all with the DOP label.

Storr thinks a lot of skeptics consider ESP so improbable that they don’t bother to read the studies and just reject them out of hand. Perhaps some do, but most don’t. Skeptic Susan Blackmore certainly didn’t; she believed in psi, studied it herself, and even earned a PhD in parapsychology before she was forced to give up her beliefs, having watched the evidence crumble away to nothing when rigorous scientific standards and close scrutiny were applied. Skeptical psychologists like Ray Hyman and James Alcock have spent untold hours analyzing psi studies in great detail and have written about the many specific flaws they found in the studies’ methodology and data analysis. Skeptic Richard Wiseman has criticized Rupert Sheldrake’s research and even repeated one of Sheldrake’s dog ESP studies to show that Sheldrake’s best dog subject couldn’t really sense when its owner was coming home. He clearly has an open mind; he says he is not 100% convinced that psi is wrong, only 90%.

Storr doesn’t have the same understanding of placebos as we on SBM do. He gives them more credit and says people believe bogus treatments work “because they do.” Also, he is bothered by the “hard problem” of consciousness and doesn’t seem to recognize the survival advantages of a conscious brain over an unconscious “zombie brain.”

Why do the Unpersuadables try so hard to persuade others?

Why do people have such an urge to aggressively force their views on others, to make them agree to see things the same way? Obviously, we want to confirm the models in our brain and reduce cognitive dissonance, but there is more. Storr thinks skeptics treat belief as a moral choice and fail to realize how little control we have over the ways our brains deceive us. Anyone who calls himself a freethinker betrays an ignorance of the motors of belief. The woman who believes homeopathy cured her is not free to think otherwise.

He says where there is illegality or racial hatred, call the police; where there is psychosis, call a doctor; where there is misinformation, bring learning (I see that as our role on SBM). But where there is ordinary madness and eccentricity, we should celebrate the riches of our species. “To be mistaken is not a sin. Wrongness is a human right.”

As if all this weren’t problematic enough, Storr asks a further thorny question: why do minds sometimes change? Is it because reality intrudes, or because we hear a story we like better?

Conclusion

We are creatures of illusion. To be human is to be “unpersuadable,” at least to some degree about some things. We can’t escape the limitations of our prehistoric brains, but we can learn to constantly remind ourselves that no matter how right we feel, there is a possibility that we might be wrong.

Posted by Harriet Hall

Harriet Hall, MD, also known as The SkepDoc, is a retired family physician who writes about pseudoscience and questionable medical practices. She received her BA and MD from the University of Washington, did her internship in the Air Force (the second female ever to do so), and was the first female graduate of the Air Force family practice residency at Eglin Air Force Base. During a long career as an Air Force physician, she held various positions from flight surgeon to DBMS (Director of Base Medical Services) and did everything from delivering babies to taking the controls of a B-52. She retired with the rank of Colonel. In 2008 she published her memoirs, Women Aren't Supposed to Fly.