

As we search for a logo for SBM or the SfSBM, Mark Crislip has been a strong advocate of using an image of Sisyphus, endlessly pushing a boulder up a hill only to have it roll back down again. That suggestion is a bit too self-defeating to embrace enthusiastically, but it does reflect a common feeling among all of us here at SBM: promoting science can be a frustrating endeavor.

Our frustration reflects a broader phenomenon: it is difficult to persuade people with facts and logic alone. People tend to prefer narrative, ideology, and emotion to facts, and the high degree of scientific illiteracy in the culture presents another barrier.

In recent years psychologists have demonstrated experimentally what we have come to understand through personal experience: people engage in a host of cognitive defense mechanisms to protect their beliefs from the facts. We jealously guard our worldview and are endlessly creative in shielding it from refutation.

A recent series of experiments published by Friesen, Campbell, and Kay in the Journal of Personality and Social Psychology demonstrates that one strategy commonly used to protect our beliefs is to render them unfalsifiable, or at least to incorporate unfalsifiable elements.

What they demonstrate is that people will shift between factual and unfalsifiable justifications for their beliefs as needed, depending on the facts with which they are confronted. For example, opponents of same-sex marriage were shown statistics indicating either that children raised by same-sex couples did well or that they were troubled. When given facts that supported their position, subjects defended it on factual grounds; when given facts that contradicted it, they retreated to moral opinion.

The authors also found that unfalsifiable elements (subjective opinion, moral judgments, etc.) were used to criticize opponents more aggressively. The strategy was therefore deployed both offensively and defensively.

This research is in line with previous research, summarized here by Chris Mooney, demonstrating that when people are confronted with facts that contradict strongly held political opinions, they tend to strengthen their original opinion. Contradictory facts simply provoke motivated reasoning in defense of the existing position, reinforcing it.

People engage in a number of cognitive strategies to deny evidence and maintain strongly held beliefs. We encounter all of them here at SBM, which is why some people still believe, for example, that magic water can cure anything. As the recent research shows, one such strategy is to make one’s belief partly or entirely unfalsifiable. The examples used by the researchers elicited the “moral opinion” approach, but there are others.

In the context of medicine we often encounter the “you can’t test my treatment” defense: mystical energies are too subtle to be detected by science, for example. When I was on the Dr. Oz show, he defended his faith in acupuncture (despite the negative scientific evidence) by stating that Western science does not know how to test this Eastern wisdom, and that it was arrogant to think that it could.

This represents a move away from science as a method for knowing which treatments are safe and effective, something also shown in the research: when the scientific evidence does not support your worldview, you may simply reject science itself.

Another method is to reject only the science that does not agree with you. You can do this by cherry-picking the science you like, or by coming up with reasons to dismiss the scientific evidence that contradicts your beliefs. In its simplest form this amounts to, “my experts are better than your experts.” You can always find a crank somewhere, or even a legitimate expert who holds a minority or contrarian position.

Yet another method is to play the conspiracy/shill/”Big” card. Any scientist or journalist presenting an analysis or fact that goes against the preferred belief is labeled a shill for Big Whatever. If necessary, the shill can be made part of a massive conspiracy to hide the truth from the public. This is often presented as a cynical and simplistic “follow the money” argument: there is money in X, therefore you cannot trust the powers that be.

If you read the comments on articles about medical topics, especially controversial ones, you will see scientific evidence dismissed with broad brushstrokes that simply refer in the abstract to Big Pharma, shills, government malfeasance, and the corrupting influence of profit. Many articles and books by proponents of unscientific belief systems do essentially the same thing in a more sophisticated form, adding detail to the broad brushstrokes but still amounting to little more than a dismissal of inconvenient scientific evidence.

Conclusion

The grand social media experiment and formal psychological studies alike are revealing a profound human tendency to prefer existing belief systems over accuracy. Most frustrating is the tendency to engage in motivated reasoning in defense of a position rather than alter that position to best accommodate the scientific facts.

The specific mechanisms people use to maintain their desired beliefs include: incorporating unfalsifiable elements such as moral opinions and subjective judgments, or declaring the phenomenon not amenable to scientific investigation; dismissing evidence by attacking the messenger as a shill; invoking conspiracies by “Big” whatever or a malfeasant government; cherry-picking evidence or experts; indulging in naked cynicism; and denying the role of science itself in addressing such issues.

The authors of the recent paper conclude:

…in a world where beliefs and ideas are becoming more easily testable by data, unfalsifiability might be an attractive aspect to include in one’s belief systems, and how unfalsifiability may contribute to polarization, intractability, and the marginalization of science in public discourse.

This seems like a reasonable concern, and it may be an unanticipated dark side of the vast increase in access to information afforded by the internet. In the past, simple ignorance was enough to shield one’s belief system from refutation. Up until 20 years ago, if I were engaged in a discussion with a believer in homeopathy, for example, I could state that the evidence shows homeopathy does not work, and they could state that the evidence shows it does. Unless one of us was walking around with review articles in our pocket, that would be the end of it.

Today, if you state an incorrect fact, it is highly likely that someone will provide one or more links to references refuting it, or even shove a smartphone in your face with the correct information. This encourages the development of skills for dismissing facts and denying the legitimacy of a specific science, or of science in general. Conspiracy theories, witch hunts, and sophisticated nonsense are therefore also on the rise, countering the threat that the ready availability of facts poses to belief systems. The ironic result is that access to facts may have a polarizing effect rather than resolving differences.

One possible solution is to teach critical thinking skills, helping more individuals transcend this evolved tendency to dig in their ideological heels. This is a long and difficult process, of course, but we will continue to push that boulder up the hill.



Posted by Steven Novella

Founder and currently Executive Editor of Science-Based Medicine, Steven Novella, MD, is an academic clinical neurologist at the Yale University School of Medicine. He is also the host and producer of the popular weekly science podcast, The Skeptics’ Guide to the Universe, and the author of NeuroLogica Blog, a daily blog that covers news and issues in neuroscience, as well as general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society. Dr. Novella has also produced two courses with The Great Courses, and published a book on critical thinking, also called The Skeptics’ Guide to the Universe.