Jenny McCarthy flaunting her “expertise” at the antivaccine “Green Our Vaccines” rally in Washington, DC in 2008

The major theme of the Science-Based Medicine blog is that the application of good science to medicine is the best way to maintain and improve the quality of patient care. Consequently, we spend considerable time dissecting medical treatments based on pseudoscience, bad science, and no science, and trying to prevent them from contaminating existing medicine with unscientific claims and treatments. Often these claims and treatments are represented as “challenging” the scientific consensus and end up being presented in the media—or, sadly, sometimes even in the scientific literature—as valid alternatives to existing medicine. Think homeopathy. Think antivaccine views. Think various alternative cancer treatments. When such pseudoscientific medicine is criticized, frequently the reaction from its proponents is to attack “consensus science.” Indeed, I’ve argued that one red flag identifying a crank or a quack is hostility towards the very concept of a scientific consensus.

Indeed, I even cited as an example of this attitude a Tweet by Jane Orient, MD, executive director of the American Association of Physicians and Surgeons (AAPS). This is an organization of physicians that values “mavericky-ness” above all else, in the process rejecting the scientific consensus that vaccines are safe and effective and do not cause autism or sudden infant death syndrome (SIDS), that HIV causes AIDS, and that abortion doesn’t cause breast cancer, to name a few. Along the way, the AAPS embraces some seriously wacky far-right-wing viewpoints, such as the notions that Medicare is unconstitutional and that doctors should not be bound by evidence-based practice guidelines because they are an affront to the primacy of the doctor-patient relationship and—or so it seems to me—the “freedom” of a doctor to do pretty much damned well anything he pleases to treat a patient.

I’ll repost Dr. Orient’s Tweet:

As I said at the time (a great example can be found here), on the surface this seems quite reasonable, but, as I’ve discussed on many occasions, science is all about coming to provisional consensuses about how the universe works. Such consensuses are challenged all the time by scientists. Sometimes they are shown to be incorrect and require revision; sometimes they are reinforced. That’s how science works.

The reason I brought up this issue again is that I came across a couple of articles relevant to this topic. The first is one by John Horgan, who blogs over at Scientific American, entitled ‘Everyone, Even Jenny McCarthy, Has the Right to Challenge “Scientific Experts”.’ Given my tendency towards snarkiness, my first thought was simply to dismiss this (at least the title) as a straw man argument, because I know of no strong defender of science (least of all me) saying that non-experts—yes, even Jenny McCarthy—don’t have the right to challenge experts. When we complain about “false balance,” it’s not because we think that, for example, antivaccine activists don’t have the “right” to challenge the experts supporting the scientific consensus. Rather, it’s because we argue—correctly, I believe—that media outlets all too often present such challenges as falsely equivalent to the actual consensus science being challenged, in essence putting someone like Jenny McCarthy on or near the same plane as actual scientists. Examples abound and have been discussed on this very blog, covering many relevant topics, such as influenza, dubious cancer cures, homeopathy, vaccine safety and efficacy, fear mongering about food by Vani Hari (a.k.a. The Food Babe), and many others.

So let’s see what Horgan objects to:

Years ago I was blathering to a science-writing class at Columbia Journalism School about the complexities of covering psychiatric drugs when a student, who as I recall had a medical degree, raised his hand. He said he didn’t understand what the big deal was; I should just report “the facts” that drug researchers reported in peer-reviewed journals.

I was so flabbergasted by his naivete that I just stared at him, trying to figure out how to respond politely. I had a similar reaction when I spotted the headline of a recent essay by journalist Chris Mooney: “This Is Why You Have No Business Challenging Scientific Experts.”

Mooney is distressed, rightly so, that many people reject the scientific consensus on human-induced global-warming, the safety of vaccines, the viral cause of AIDS, the evolution of species. But Mooney’s proposed solution, which calls for non-scientists to yield to the opinion of “experts,” is far too drastic.

Oddly enough, the article by Chris Mooney cited by Horgan isn’t particularly recent; it’s close to 10 months old. Be that as it may, yes, that hapless student did have a rather naïve attitude, but what he said is not quite the same as what Mooney argued, Horgan’s conflation of the two notwithstanding. Let’s put it this way. There’s a not-insignificant difference between saying “you have no business challenging scientific experts” and “you have no right to challenge scientific experts.” The first warns lay people and those without the appropriate expertise that they should be very careful about challenging a scientific consensus; it does not say that they have no right to make such challenges. What Mooney calls for is the recognition that there is such a thing as expertise and that challenging it requires more than just a Google education.

Mooney’s article relied heavily on the viewpoint of Professor Harry Collins of Cardiff University, who runs the Centre for the Study of Knowledge Expertise Science at the Cardiff School of Social Sciences. Specifically, Mooney was discussing a book by Collins entitled Are We All Scientific Experts Now? In the book, Collins lays out a robust defense of scientific expertise. According to Mooney, Collins is known for an investigation in which he was embedded for over a decade within the community of gravitational wave physicists, to the point where he became so familiar with their culture that he was actually able to trick physicists into thinking he was one of them. During that time he debunked several myths about science, such as the “eureka moment” and the idea that scientists always follow the data where they lead; in fact, scientists sometimes cling to established paradigms in the face of new evidence. As Mooney put it, the upshot was that “while the scientific process works in the long run, in the shorter term it is very messy—full of foibles, errors, confusions, and personalities.”

I’ve said almost exactly the same thing myself on more occasions than I can remember. In the short term, science can be incredibly messy. Early results that seemed promising often undergo a “decline effect” and appear less solid. In medicine, in particular, physicians and scientists sometimes cling to old paradigms longer than they should in the face of new evidence. On the other hand, because human lives are at stake, this is somewhat understandable. Because being wrong about new findings in medicine can cause actual harm to human beings, physicians tend to be conservative and need a lot of convincing. Sometimes this tendency goes too far. Indeed, we have a bit of a joke in medicine that no medical treatment is ever entirely abandoned until the last group of physicians who trained when it was the standard of care has retired or died off. It’s actually not quite that bad, and doctors do pay attention to negative studies. There are lots of things we as surgeons did in the 1990s as part of breast cancer treatment, for instance, that we no longer do and things that we didn’t do then (and hadn’t even thought of yet) that we do now. There’s also a countervailing tendency among physicians to “jump on the bandwagon” of new treatments before they’re properly validated, sometimes for marketing advantage, sometimes just to be a trailblazer. Laparoscopic cholecystectomy in the early 1990s comes immediately to mind as an example of this tendency.

Still, as messy as science is, I agree with Mooney that in the long term the scientific process works. I also like Collins’ concept of the Periodic Table of Expertises, described by Mooney:

Read all the online stuff you want, Collins argues—or even read the professional scientific literature from the perspective of an outsider or amateur. You’ll absorb a lot of information, but you’ll still never have what he terms “interactional expertise,” which is the sort of expertise developed by getting to know a community of scientists intimately, and getting a feeling for what they think.

“If you get your information only from the journals, you can’t tell whether a paper is being taken seriously by the scientific community or not,” says Collins. “You cannot get a good picture of what is going on in science from the literature,” he continues. And of course, biased and ideological internet commentaries on that literature are more dangerous still.

Interactional expertise requires deep experience with a specialty of the kind that can’t simply be learned by reading the literature; in essence, it involves, to some extent or other, actual experience studying and doing research in the relevant specialty. Our professors in medical school often pointed out that at least half of what we were being taught would be incorrect within a decade. Whether that exact estimate is true or not is not critical to the main point, which is that medicine and medical knowledge change fairly rapidly as new scientific findings are reported and that we, as physicians, have to learn to adapt and integrate these new findings into our practices. Because keeping up with the latest literature is often too much for one person to do without help, we rely on the broader community of physicians and medical scientists. One way we do this is to attend medical and scientific conferences relevant to our specialty. (For example, I will be going to Houston later this week to attend the Society of Surgical Oncology meeting, and in April I will be attending the American Association for Cancer Research annual meeting in Philadelphia.) We form professional societies that gather groups of experts together to produce and periodically update guidelines based on the best existing evidence.

To show the difference between literature knowledge and interactional knowledge, Collins uses this analogy:

The next step after popular understanding is the kind of knowledge that comes with reading primary or quasiprimary literature. We will call it ‘primary source knowledge.’ Nowadays the internet is a powerful resource for this kind of material. But even the primary sources provide only a shallow or misleading appreciation of science in deeply disputed areas though this is far from obvious: reading the primary literature is so hard, and the material can be so technical, that it gives the impression that real technical mastery is being achieved.

It may be that the feelings of confidence that come with a mastery of the primary literature is a factor feeding into the ‘folk-wisdom view’ [the view that ordinary people are wise in the ways of science and technology]. But any amateur trying to apply the knowledge gained from car-repair manuals will soon learn the bitter lesson that much less can be done as a result of reading information than appears to be the case. The same applies to doctoral students in the sciences; their first experience of real research is usually a shock, however well accomplished they have become in reading the published literature and doing well-rehearsed experiments as undergraduates. But even experienced scientists tend not to understand the amount of tacit knowledge on which their abilities depend. Thus, studies of tacit knowledge transmission show, inter alia, that scientists will embark confidently on an experimental project having done nothing more than read the literature and only later discover the degree of joint practice and/or linguistic socialisation that is needed to make a success of it (to generate the capacity to do the thing rather than talk about it).13 Given trainee scientists’ experiences, and professional scientists’ lack of reflective appreciation of their own tacit knowledge, it is no surprise that a member of the public encountering the professional journals or the internet might easily come to think that they have found a direct line to understanding.14

Footnote 14 brings this concept home to medicine:

14. A familiar image is today’s informed patient visiting their doctor armed with a swathe of material printed from the internet. While this kind of information gathering, especially in the context of a support or discussion group, can be valuable, it is important not to lose sight of what sociologists have shown: a great deal of training and experience is needed to evaluate such information. Sociologists of science seem to forget the lessons of their own subject rather easily.

Exactly.

None of this is to say that those who haven’t reached the level of interactional knowledge about a subject don’t have the right to criticize scientific consensuses within that subject. It does, however, warn those who would be critical to avoid the hubris of thinking that their popular science or even primary literature knowledge is sufficient.

It’s also important to remember that there are scientific consensuses and then there are scientific consensuses. What I mean is that some consensuses are stronger than others, something Horgan seems to ignore or downplay. For example, he seems quite pleased with himself that he apparently got something right that Stephen Hawking got wrong. Last year, cosmologists overseeing a project called Background Imaging of Cosmic Extragalactic Polarization 2 (BICEP2) reported the “first direct evidence” of inflation, a theory that says the universe went through a period of extremely rapid expansion right after the Big Bang. Hawking had made a bet with another scientist, cosmologist Neil Turok, director of the Perimeter Institute in Canada, that gravitational waves from the first fleeting moments after the Big Bang would be detected by BICEP2. He had even declared victory on the BBC, leading Horgan to nearly strain his shoulder and elbow patting himself on the back when Hawking turned out to be wrong:

No less an authority than Stephen Hawking declared that the BICEP2 results represented a “confirmation of inflation.” I nonetheless second-guessed Hawking and the BICEP2 experts, reiterating my long-standing doubts about inflation. Guess what? Hawking and the BICEP2 team turned out to be wrong.

I’m not bragging. Okay, maybe I am, a little. But my point is that I was doing what journalists are supposed to do: question claims even if–especially if—they come from authoritative sources. A journalist who doesn’t do that isn’t a journalist. He’s a public-relations flak, helping scientists peddle their products.

Here’s the thing. There’s a huge difference between a well-settled scientific consensus and cutting edge cosmology. Yes, Stephen Hawking is undeniably an expert, but he was expressing a scientific opinion on a matter that was (and is) not at all close to settled science. That he turned out to be wrong is not shameful. It turns out that the BICEP2 investigators teamed up with another group of scientists from the European Space Agency to analyze the data from BICEP2 and ESA’s Planck satellite and found that the previous analysis had overlooked factors that could produce a false positive. Again, this is the sort of reversal that is not uncommon when scientists are doing research at the bleeding edge of scientific discovery.

It is a very different thing than the science that tells us homeopathy can’t work, that vaccines are safe and effective, and that energy healing is more magic than science. Moreover, as one commenter pointed out to Horgan, the “first meaningful (i.e. based on actual scientific arguments) doubts about the BICEP2 results I saw were on the blogs of professional cosmologists, and not issues raised by journalists themselves (although yes, they were later reported by journalists).” Also, as another commenter pointed out, just because the BICEP2 team was wrong about the interpretation of their experiment doesn’t necessarily mean they were wrong about inflation. That commenter also pointed out (as I just did) that inflation is not yet consensus theory.

Horgan goes on to argue that “it’s precisely because we journalists are ‘outsiders’ that we can sometimes judge a field more objectively than insiders.” Well, yes and no. It might well be possible for an outsider like Horgan to judge conflicts of interest, disclosed or undisclosed, better than “insiders” can, as in the case of pharmaceutical funding and influence on drug development. Indeed, just check out Brian Deer’s excellent work beyond his exposure of Andrew Wakefield’s scientific fraud (e.g., his reporting on the TGN 1412 clinical trial). Just look at what Ben Goldacre has accomplished. That being said, unless Horgan has reached the level of interactional knowledge, his “insights” about a scientific field, in particular the scientific consensuses within that field, should not be treated the same as those of real experts.

Unfortunately, Horgan rather misses the point in his conclusion:

Google is reportedly working on algorithms for evaluating the credibility of websites based on their factual content. But there will never be a foolproof way to determine a priori whether a given scientific consensus is correct or not. You have to do the hard work of digging into it and weighing its pros and cons. And anybody can do that, including me, Mooney and even Jenny McCarthy.

By the way, I think McCarthy grossly overstates the dangers of vaccines–I’m glad my kids got vaccinated–but I, too, have concerns about some vaccines.

Reading Horgan’s article referenced in the last sentence, “Michele Bachmann Wasn’t Totally Wrong about HPV Vaccines,” made me cringe. Although Horgan does point out that he didn’t believe Bachmann’s ignorant blather about Gardasil causing mental retardation, he does seem to conflate questions about Merck’s marketing strategy for Gardasil with the question of whether a mass vaccination campaign against HPV is worth the expense and bother, poisoning the well about Gardasil with distrust of pharmaceutical marketing instead of concentrating on the scientific and medical merits and disadvantages of the vaccine itself.

A number of commenters had excellent retorts to Horgan’s argument that anyone, including Jenny McCarthy, Chris Mooney, and, yes, John Horgan, can do the “hard work of digging into” a scientific consensus and “weighing its pros and cons”: “And anybody can play basketball, including me, Michael Jordan, and Stephen Hawking.”

Or, with the country being in the thick of March Madness and all:

People can pretend to be scientists. That doesn’t make them scientists. They can forget that they failed high school math, majored in art history in college, and became an actor. I can also forget that I did not play for the Tar Heels and the Bulls, but basketball sure is fun. Expertise takes a lot of hard work. Someone claiming they have expertise when they don’t is arrogant. James Inhofe melting a snowball is not expertise in climatology. Call it a gimmick, but it was a gimmick that showed he did not know the difference between global climate and local weather. Yes, he does have the right to his beliefs, but others have the right to laugh at his beliefs as being profoundly ignorant.

Precisely. This is where Horgan again misses the point. It’s not about people without expertise not having the “right” to question a scientific consensus. Clearly, a fair reading of his article indicates that not even Mooney meant that when he said you have “no business” questioning the scientific consensus. (Besides, as I recently learned when I was published on Slate.com, it’s usually the editor who comes up with the headline, not the writer.) Rather, it’s about how that consensus is questioned. When it’s questioned the way Jenny McCarthy and antivaccinationists question the scientific consensus, using misinformation, pseudoscience, cherry-picked studies, and misinterpretations of other scientific studies, such “questioning” devolves into denialism and should be called out. In other words, how one questions a scientific finding matters. A lot.

Finally, it’s also about how much the questioning of a scientific consensus by a non-expert should be valued. Someone like Horgan might have a modicum of credibility questioning a scientific consensus based on his experience as a science journalist, particularly when we’re talking about something that isn’t a particularly strong consensus (inflation), if it’s even a consensus at all. Someone like Jenny McCarthy, whose relevant expertise doesn’t even reach the level of “literature knowledge,” has no credibility at all. Having people like me say so and people like Mooney say that she has “no business” making such pronouncements is simply the price she pays for parading her ignorance to the world, particularly when her ignorance contributes to real degradation of public health through increasing numbers of parents not vaccinating their children. In the end, what Horgan seems to be arguing is that we should take pseudoexpertise seriously. I disagree.

Related posts:

  1. Hostility towards scientific consensus: A red flag identifying a crank or quack
  2. Science-based medicine, skepticism, and the scientific consensus
  3. Pseudo-expertise versus science-based medicine
  4. The “decline effect”: Is it a real decline or just science correcting itself?


Author

Posted by David Gorski
