
Medical training versus scientific training

This month I will begin my third year of medical school, after a three-year break for laboratory research. Living alternately in the worlds of med school and grad school has prompted me to reflect on differences between these training programs.

[Obvious disclaimer: I have studied at a single institution, and only for five years.]

I am enrolled in a dual-degree MD/PhD program. About 120 US medical schools have such programs, and the National Institutes of Health funds a third of them (MSTP). The schedule of such programs is generally: 2 years of medical school (culminating in USMLE Step 1), 3+ years of graduate school (culminating in dissertation and PhD), and then the last 2 years of medical school (which I begin this month). The most popular residency choices for MD/PhD graduates are internal medicine, pediatrics, and pathology (match data). Other residencies that attract these graduates include dermatology, neurology, ophthalmology, and radiology (survey data). The hope of those funding the MD/PhD training programs, and of those accepting the graduates, is that these individuals will become physician-scientists, bridging the divide between lab bench and patient bedside with insights from both.

[Note that the majority of physician-scientists do not receive dual degrees but rather are MD physicians who began conducting research after residency. Degrees are imperfect signals of scientific acumen, as SBM readers surely know.]

One tricky bit during MD/PhD training (and presumably also during the career of a physician-scientist) is transitioning between lab and clinic. One of the feared jumps is what I am facing right now: after years of focus on my thesis project, I will begin clerkships alongside third-year med students fresh from their USMLE board exam. Of course I have forgotten much of the detailed knowledge about drugs and diseases that I accumulated for the board exam, which I took in 2006. My clinical skills, rudimentary as they ever were, are undoubtedly quite rusty. Just to top it off, all my friends who shared first and second year with me are now residents, so I know few of my new classmates!

As I transition back to med school, I am reflecting on my grad school experiences and how they will support my further training. The details of my research project will not likely lead to any breakthrough cures, but by participating in the endeavor of science I am beginning to truly appreciate the painstaking process by which scientific knowledge is generated. My experiences in the lab have taught me the risk of relying on a transitory positive finding before it is replicated. From designing my own experiments, basic as they may have been, I can recognize how challenging it must be to properly study a complex intervention in human subjects by asking focused questions and adhering to rigorous methodology. In graduate school, students are trained to ask not just “what do you know?” but more importantly “how do you know it?”

On the other hand, the first and second years of medical school are largely concerned with teaching the basic facts and models that undergird modern medicine. Courses such as biochemistry and physiology are often referred to as “science” courses, but in general these courses teach the results of science without dwelling much on the process of science. This focus on facts is understandable; physicians cannot function without knowing the languages of genetics, anatomy, pharmacology, and the like. But I wonder if an adverse effect of these years of rapid fact accumulation is an overly trusting attitude towards authorities. Most of the curriculum in years 1 and 2 of med school represents the consensus that forms the knowledge base of medicine. It can be learned not just from textbooks and lectures but from review books and even Wikipedia at times. A lecturing professor is not challenged; rather, her words are dutifully transcribed and memorized for the test. And safely so! Lectures rarely focus on controversy or uncertainty. Therefore, the first half of medical school imparts a great amount of requisite knowledge, but it does not do much to encourage scientific skepticism.

[CAM tie-in: Some training programs for alternative practitioners describe similar "science classes" to those taught in medical school. Even assuming high quality, such classes (in medical school, naturopathy school, whatever) are more about teaching dogma than critical thinking. Physiology dogma is more useful than homeopathy dogma, to be sure, but such coursework is nothing like the training of a scientist.]

The good news is that third year looks designed to begin fostering independent learning and critical thinking in a clinical setting. My new med school classmates and I were advised, “Don’t complain that ‘no one is teaching me.’ The physicians’ top priority is the patient, not you. Take charge of your own learning.” Of course there are still a lot of new facts to learn (and in my case, old facts to remember), so I’ve already bought a ponderous textbook for my first clerkship. But the book-learning is supplemented with apprentice-style participation in care for patients. Ultimately, it seems to me, the challenge of medicine is the thoughtful application of population data to the care of the individual, and I expect that third year will begin my training in this art.

I recognize that we neither need nor can afford to train all medical students to be physician-scientists. I also recognize that the pre-clinical years of undergraduate medical education have little room for learning goals beyond mastery of basic facts. But I wonder, particularly as I read SBM reports of sloppy thinking and bad science permeating medicine and academia, if a shift of focus from “what we know” to “how we know” might be in order. In the age of smart phones and handheld computers, instant recall of obscure facts by human brains may be slightly less important than it once was. Web-savvy patients come to the doctor with facts and myths and worries. The physician of the future surely cannot know everything but must be able to evaluate anything.

I am halfway through Snake Oil Science by R. Barker Bausell (reviewed by Dr. Hall). The lessons of this terrific book are what I wish I had found more of in med school so far.

Posted in: Medical Academia, Science and Medicine


20 thoughts on “Medical training versus scientific training”

  1. Danio says:

    Excellent post, Tim. I think it’s a common flaw of our education system that we put way too much emphasis on the memorization/regurgitation of some density of ‘truths’, at the expense of actually teaching people *how to think*. The fact that someone with your critical thinking skills has chosen to undergo training in this manner is really heartening. Best of luck easing back into the clinical realm!

  2. manixter says:

    I would adhere to the “when in Rome…” recommendation. Because, of course, you’ll be expected to learn an unpredictable mix of “gold standards” (that is, USMLE-approved) and site-specific, specialty-specific, attending-specific standards.
    Once you know the accepted standards of care, as they are written and sung, you will have the luxury of digging your spade in randomly and finding out where the evidence gaps lie. Hopefully, the clinical experience you’ll have had by then will guide you as to how to act in the absence of good evidence (a condition that seems to be more common than not).
    However, the basics always apply:
    Know your patient (the patient is your teacher. Spend as much time with them as possible– every interview is a lecture, every chart a textbook)
    Know yourself– particularly the limits of your own knowledge

    Being able to separate knowledge into “I know” and “I’ll find out” instead of “I win” and “I lose” will, in the long run, serve you very well, no matter how “rusty” your skills.

  3. DVMKurmes says:

    Definitely an excellent post. It also reflects the same flaw in veterinary medical education. I was fortunate to have a couple of excellent teachers as an undergrad before vet school, and had the opportunity to work in several labs for periods of several months to 2 years. I noticed a difference between how many of my veterinary school classmates thought about and learned things and how the few of us with more rigorous science backgrounds did. Unfortunately, the competitive nature of gaining acceptance to professional schools, as it stands now, does more to encourage rote memorization and test-taking skills than it does to teach critical thinking ability.

  4. overshoot says:

    Pardon me for showing my age, but long ago one of my professors delivered a memorable rant that’s stuck with me. He was, to put it mildly, not pleased with the (then) emerging trend of referring to “University training.”

    His point, condensed, is that the product of education is understanding, whereas the product of training is skill. Obviously clinical medicine requires skill and thus training, but I fear that calling the academic side “training” blurs the very distinction that you are trying to reintroduce.

  5. Newcoaster says:

    Excellent post, Tim (from a fellow Tim). And ironically, I am currently preceptoring a 3rd-year med student doing his month of rural experience before being unleashed on the wards this fall. In talking with him about AltMed, it seems that critical thinking is lacking, and that most AltMed claims are accepted at face value.

    Doctors use applied science, but we have rarely done science. I did a little research in some of my undergrad and summer jobs, but I would never call myself a scientist. I read a lot about different kinds of science, and I understand the scientific method, but I’m not a scientist.

    Even more concerning, medical school admissions these days place less emphasis on basic scientific undergraduate experience. For example, I spent 6 years noodling around university and obtained two different BSc degrees before applying to med school, but many of my classmates just had BAs with rudimentary general science courses. I’m not sure how they survived biochemistry in 1st year, other than rote memorization.

  6. daijiyobu says:

    Great post.

    Per “CAM tie-in: some training programs for alternative practitioners describe similar ‘science classes’ to those taught in medical school. Even assuming high quality, such classes (in medical school, naturopathy school, whatever) are more about teaching dogma than critical thinking.”

    I agree, and I did the ND route but stopped after four years due to what I nowadays call the naturopaTHICK: absurd sectarian dogmas ‘masquerading / falsely labeled as’ modern science.

    Much of higher education is shallow fact-regurgitation / mimicry, as opposed to deep active-learning / authenticity.

    Isn’t it interesting that the AANMC-AANP ND crowd actually claim MORE such science coursework of the ‘science facts [but not process]‘ variety:

    e.g. more BMS and clinical science than Stanford or Yale med schools, and board exams whose parts are both titled “science” [with part 2 containing homeopathy]

    http://www.ohsu.edu/cam/for_docs/curri/natur_med.pdf

    And this MUST be true, because it is written by NCNM’s ND School Dean / “CMO” Meletis, and it is hosted by Oregon Health & SCIENCE University!

    And per “the physician of the future surely cannot know everything but must be able to evaluate anything”, what’s doubly ironic is

    a) that that OHSU-NCNM pdf I’ve cited above has that wacko Edison quote that NDs love to echo which begins with much the same language…

    “the doctor of the future will give no medicine [blah blah blah]”

    b) that NDs love homeopathy…truly ‘not giving any medicine’.

    -r.c.

  7. Joe says:

    Tim, and commenters, this is excellent.

    You are right that “science” is taught as facts, rather than as a method, in college and medical school. I think it has to be that way because time constraints don’t permit logical development of the field (my field is chemistry). In other words, if I teach chemistry from the ground up, I will need a lot more time to produce a student whose skills are in demand at graduation. It will make a chemistry degree require more credits (semesters, money) unless we drop requirements in general education.*

    When you have some leisure time (in, say, 30+ years), you might appreciate Carl Sagan’s book “Demon Haunted World.” In it, he opines that primary- and high-school students have the inquisitiveness and critical-thoughtfulness squeezed out of them. That’s when they learn to shut up and take notes; instead they need to have those qualities nurtured. Unfortunately, today, “No Child Left Behind” (NCLB) forces teachers to “teach to the test” and they don’t have the time to properly demonstrate the scientific method (because it isn’t on the tests). NCLB must be repealed.

    Yes, it is a problem that so many people, med-students included, have been trained to note and record what anyone in a suit says.

    @ daijiyobu, “Isn’t it interesting that the AANMC-AANP ND crowd actually *claim* MORE such science coursework of the ‘science facts …’” I agree [the added emphasis on “claim” is mine].

    As I think you are suggesting, claims about coursework are not probative; after all, we have accredited colleges of astrology. Also, we have accredited colleges of chiropractic whose products assert that they have studied basic science, and then believe in subluxations.

    * A colleague of mine went to an Institute of Technology for his BS, and asked me to look over his first grant application. I said it was very hard to read, and he replied that my reaction was what he feared: he had not been required to take an English course in college.

  8. mxh says:

    As I get back into the third year after finishing grad school, I completely agree with you about the “what we know” vs. “how we know” methods of teaching. The problem with the current method is that things change rapidly, and if students have only memorized the answers without understanding why, they’ll have a lot of trouble catching up. I’m so ingrained in the culture of science, and asking “how” and “why” is so natural for me, that I feel I’ll have quite a bit of trouble fitting in with the rest of the third years. Hopefully, it won’t hurt me too much (I guess I’ll find out in two days).

  9. The Blind Watchmaker says:

    Hopefully, students should be learning the “how we know” stuff in grade school, or at the latest in high school. If I had to choose one thing to fix about the public education system, I would start teaching critical thinking in first grade. Politically, I don’t think that this would fly in the U.S.

    I enjoyed your post. Medical students, I think, have a basic understanding of how the (so-called) “dogma” got into the texts that they are memorizing. The texts present the consensus of data, facts and theories brought about by science. These form the foundation from which they will build and explore their future careers.

  10. rajeshm6 says:

    Really a very thoughtful post! I loved the topic of discussion, medical training versus scientific training. You (admin) are really a sort of inspiration for my blog – http://www.medicine2life.com – these posts fascinated me and made me think innovatively!

  11. daedalus2u says:

    Critical thinking really should be taught earlier than college. It should be an integral part of school from kindergarten. I think it is much more important than rote memorization of facts, and much more difficult to teach and to measure through testing.

    I think that teaching critical thinking is in conflict with the top-down authoritarian hierarchy that many teachers and parents feel is appropriate. The goal of a “good” teacher is for the student to acquire, and then build on and surpass, the abilities and achievements of the teacher. Putting so much emphasis on competition between PIs for funding creates a disincentive to having students surpass their mentors.

  12. Tracy W says:

    This post appears to assume that there is a trade-off between teaching critical thinking and teaching facts. But as far as I can tell the evidence from cognitive science is that critical thinking is dependent on mastery of a good deal of facts.

    The research from cognitive science is that critical thinking is not a skill like riding a bike, which you learn once and can then apply in multiple situations. Critical thinking is entwined in content (called domain knowledge). So you can remind a student that they should look at an issue from multiple perspectives so often that the student will learn to do so, but if a student doesn’t know much about an issue, they can’t think about it from multiple perspectives. Courses that aim to teach students to think critically, in the absence of a particular content area, have some value, but it’s limited. See http://www.aft.org/pubs-reports/american_educator/issues/summer07/Crit_Thinking.pdf

    Web-savvy patients come to the doctor with facts and myths and worries. The physician of the future surely cannot know everything but must be able to evaluate anything.

    But how do you evaluate anything if you don’t know a lot of facts?
    Look at the other posts you have written here, for example your previous one on 12 June at http://www.sciencebasedmedicine.org/?p=522
    Here you write about:
    – what it means to call a journal “highly ranked”, as a way of criticising Begley’s claim that a poorly ranked journal is considered altogether inferior;
    – how you drew on your knowledge of the relationship between academia and NIH grants to criticise Begley’s attribution of responsibility;
    – how you drew on knowledge of the elusiveness of positive results.

    If you knew nothing about how medical research funding works, how would you have known how to evaluate the paper?

    Or take medical knowledge more generally. Why should medical research be blinded? Because of the placebo effect – an empirical one. Why is double-blinding useful? Because the staff carrying out the medical treatments are also subject to human cognitive biases. What other possible explanations could explain a positive result from an experiment? That 5% of the time experiments will produce a significant result by pure chance. Knowing “how we know” about various human cognitive biases is valuable information, but to evaluate new scientific experiments we need to also know what those cognitive biases actually are.

    Furthermore, the more you know in any field, the more easily you can learn additional knowledge in that field. Every piece of writing or speaking makes some assumptions about the information the reader already has. The brain has to connect the information coming in with the right information already in the brain to make the connections (for example, here I just assumed that you know that the brain is the thinking organ in the body). The more familiar you are with facts, the faster these connections can be made, and thus the faster you can absorb new information in the world outside you. See http://www.aft.org/pubs-reports/american_educator/issues/spring06/willingham.htm for a more detailed explanation of this, with some experimental evidence that you can test out on yourself right there while reading it.

    Perhaps medical schools should be teaching more critical thinking. But if they’re going to do this well, this means teaching more of “what we know”, not less.

  13. WhitecoatTales says:

    Tracy W: I don’t think Tim is implying that facts are unimportant. On the other hand, you reference Tim’s own writing. I can attest from my own personal experience that the majority of medical students at my institution just don’t have the critical-skills training to have done that kind of evaluation.

    Again, anecdotally from my experience, a lot of the critical thinking training we do get isn’t formalised. We tend to pick it up on the wards, IF we have an attending who tries to make us think critically. I’ve been lucky, and I’ve had attendings who have seen my interest in SBM and cultivated it by forcing me to think critically, often. This isn’t everyone else’s experience, though; it’s certainly not uniform education.

  14. Joe says:

    @Tracy W,

    I find your post interesting. If I understand you correctly, you agree with me that teaching of facts is needed, rather than reducing them in order to spend time on the process.

    The reason we need to teach critical/scientific thinking earlier than college is that without it, every lecturer and technical subject looks the same. For example, the CAM advocates who speak at Tim’s med school look just like the legitimate medical instructors. As a result, the students simply add that to their knowledge base.

  15. Tracy W says:

    Whitecoattales, I was responding to the statement:
    “But I wonder, …, if a shift of focus from “what we know” to “how we know” might be in order. ”

    My understanding of the relevant cognitive science is that if you want to teach critical thinking well you have to teach what we know as well as how we know it.

    I can attest from my own personal experience, that the majority of medical students at my institution just don’t have the critical skills training to have done that kind of evaluation.

    Well, they are students. It takes time and practice to become an expert. In the link I provided earlier, the cognitive scientist Dan Willingham talks about “deep knowledge” versus surface knowledge. “Deep knowledge” is the ability to recognise the deep structure underlying a surface problem, for example how the law of evolution explains such varying facts as the shapes of finches’ beaks and the arrival of drug-resistant TB. And as far as cognitive scientists know, it’s a slow process creating this sort of deep knowledge for most people. As Tim himself says in this post, he’s had a 3-year break doing lab research, so he’s had that time to learn a lot of facts about lab research, which counts toward turning it into deep knowledge. (Also quite possibly he’s one of those incredibly smart people who can pick up the deep knowledge very rapidly.)

    As for a lot of critical thinking training not being formalised, I guess it depends on how you define “a lot”. My point is that teaching facts is a vital part of teaching critical thinking, and is also part of teaching students how to learn in the future, so we should classify classroom time spent teaching relevant medical facts as “critical thinking” training time, even if the teacher doesn’t give a fig about whether the students think critically.

    I am not opposed to teachers asking students to also display critical thinking about those facts while they teach those facts. That’s probably better for critical thinking skills than just teaching facts. But the learning facts bit is a vital part of learning to think critically. The two areas are not that separable.

  16. Tracy W says:

    Joe, I don’t think it’s as simple as that. Part of my point is that critical thinking is dependent on relevant facts. For example, a lot of CAM advocates report experiments which are unreliable because they don’t properly control for the placebo effect. But other areas of science have different methodological problems. Many creationist attacks on evolution rely on their audience not knowing the difference between “random mutations” and “random mutations which are then filtered by survival”. This doesn’t strike me as having much to do with the placebo effect. My own background is in electrical engineering and economics. A number of free-energy experiments are wrong because the operators measure an AC power output with a DC power meter, which just leads to non-meaningful results. This fault doesn’t strike me as having anything to do with the placebo effect or creationists. Much criticism of economic theory is flat-out wrong; people state that economists assume people only want to maximise income, when every Econ 101 course I know about includes teaching the leisure/labour trade-off, which concludes that there’s no theoretical basis for predicting a person’s labour-supply response to a change in wages. Again, this economics-woo has no relationship to the other sorts of pseudo-scientific thinking I’ve listed.

    I don’t think we can possibly teach the factual background to enable high school students to think critically about every science. And then on top of that, there’s the misuse of history and statistics as well.

    Teaching the scientific method at high school is probably a good thing, but teaching “scientific thinking” is too broad. Unless of course you know of some brilliant way of shoving a lot of facts into every high school student’s mind.

  17. Joe says:

    @Tracy W,

    I think you are right. What is really needed today is for each medical school to require a course along the lines described by SBM’s Wally Sampson http://www.ncbi.nlm.nih.gov/pubmed/11242574?ordinalpos=3&itool=EntrezSystem2.PEntrez.Pubmed.Pubmed_ResultsPanel.Pubmed_DefaultReportPanel.Pubmed_RVDocSum

    However, given the invasion of quackademic medicine as described by blog-friend Orac http://scienceblogs.com/insolence/2009/03/noooo_not_quackademic_medicine_at_my_old.php I doubt it will happen.

  18. Tracy W says:

    Hi Joe – that sounds sensible, assuming, of course, that the course actually results in students learning to think critically in the medical domain. :)

  19. bbjessicab says:

    I’m doing a paper for my English 102 class about spiritual healing vs. medical care, and I was hoping to get some statistics that I haven’t had any luck finding elsewhere. I’m trying to find out how many people (either globally or nationally) practice spiritual healing because of their religious beliefs, and also how many people (including adults) die because they refuse medical treatment. I would appreciate any thoughts or numbers. Thanks!
