
[Editor’s note: With Dr. Gorski enjoying a vacation to recharge his batteries, we present a second offering from contributor James Thomas. Enjoy!]

This might hurt a little…
From the Wellcome Trust Image Library via Wikimedia Commons

Advocates of CAM* (Complementary and Alternative Medicine) have long argued that mainstream medicine is a dangerous undertaking using toxic drugs and invasive interventions that often do more harm than good, while the various quackeries huddled under the CAM umbrella are said to use natural interventions that aid the body in healing itself. A recent BMJ article naming medical errors as the third leading cause of death in the United States was trumpeted as proof of that claim and, predictably enough, unleashed a maelstrom of pearl-clutching commentary from the CAMsters. David Gorski has already comprehensively deconstructed the ‘medical error is the third leading cause of death’ argument. Rather than re-till that ground, I will use this essay to examine the allied accusation that medicine has little interest in its own inherent dangers, and less interest still in addressing those dangers.

Make no mistake about it, medicine sometimes uses powerful drugs, deeply invasive procedures, and blasts of ionizing radiation. The patients for whom these interventions are used are ill, often desperately ill, and those interventions may be all that stand between the patient and chronic disability or an early and unpleasant death. Sometimes interventions fail, sometimes there are failures to rescue, sometimes diagnoses are wrong, sometimes errors are made. But medicine is not blind to errors when they occur, and it devotes substantial resources to preventing them. Contrary to the tale of indifference spun by the CAM community, the story of safety and quality improvement efforts in medicine is vast, far more than can be told in an essay of a few thousand words. So rather than sketch the broad outlines of quality initiatives throughout medicine, I have chosen to tell the story as it played out in a single medical specialty.

The dawn of anesthesia

Surgery has existed in one form or another for several thousand years. One of the earliest surgical interventions appears to have been amputation, perhaps first described in writing in the Sushruta Samhita in the first millennium BCE. Trepanning, too, has a long history, with archeological evidence of the practice in India some 4,000 years ago. And legend has it that Julius Caesar was delivered by the first Cesarean section, though that account is almost certainly apocryphal. All of these interventions have been around for millennia, all invited sepsis, and all were unquestionably painful.

Modern surgical anesthesia dates only from the middle of the 19th century, but efforts at controlling surgical pain can be traced at least as far back as pre-Islamic Iran where cannabis and camphor were said to be used perioperatively. Alcohol was also widely used, as was opium. In the first century, the emperor Nero’s surgeon recommended the root of Atropa mandragora – a source of atropine – boiled in wine. In the 18th century it was recommended that tobacco smoke be introduced to the rectum, supposedly giving rise to the expression ‘blowing smoke up one’s ass.’

But pain control is not anesthesia.

In 1845 the New York Daily Tribune published a detailed account of an amputation. The operation took place at New York Hospital, a five-acre nest of low brick buildings, located on what is now Lower Broadway. The patient was a young man, cradled tenderly the whole time by his father and at the same time held firmly – and brusquely – in place by the attendants. As the surgeons – there were two – made their cuts, the boy’s screams were so full of misery that everyone who could left the room. The first part of the operation complete, the young man watched “with glazed agony” as the chief surgeon pushed a saw past the sliced muscles, still twitching, and listened as the blade cut through the bone in three heavy passes, back and forth. That was the only noise in the room, for the boy had stopped screaming.

— Fenster, 2001, Ether Day

Just a year later in 1846, the first surgical anesthetic using ether was demonstrated by the American dentist William Morton** at Massachusetts General Hospital. News of the advance spread like a prairie fire.

New and Valuable Discovery. We noticed yesterday the discovery of a new preparation by Dr Morton which is intended to alleviate the sufferings of those who are forced to undergo painful operations in surgery and dentistry, as well as to facilitate the work of operators (surgeons). The effect of this new discovery is to throw the patient into a state of insensibility and while unconscious any operation can be performed without occasioning pain. We are told by a gentleman of the highest respectability that he witnessed an experiment of the use of this most extraordinary discovery at the rooms of Dr Morton one evening this week. An ulcerated tooth was extracted from the mouth of an individual without giving him the slightest pain. He was put into a kind of sleep, by inhaling a portion of this preparation, the effects of which lasted for about three quarters of a minute, just long enough to extract the tooth. This discovery is destined to make a great revolution in the arts of surgery and surgical Dentistry.

Boston Evening Transcript, October 1, 1846

Surgery would no longer be a test of the patient’s ability to withstand pain, nor the surgeon’s ability to withstand inflicting it. After witnessing one of Morton’s surgeries employing ether, Oliver Wendell Holmes Sr. suggested the word ‘anesthesia,’ from the Greek ‘anaisthesis’ (insensate), for the process of rendering a patient unconscious during painful procedures. The seed of a new medical specialty had been planted.

Ether as an anesthetic was joined by another agent a year later when Scottish obstetrician James Y. Simpson began using anesthesia during childbirth. Simpson had begun with ether but was disturbed by its persistent odor and its tendency to irritate the bronchi. He experimented on himself and his associates with, among other agents, acetone, benzene, and, the one he found most suitable, chloroform.

As an interesting historical aside, Calvinist clergy in Simpson’s native Scotland objected to Simpson’s use of chloroform arguing that the pain of childbirth should be endured with patience and fortitude. One wonders if the exclusive maleness of the Calvinist clergy might have had some bearing on their position. In any event, that issue dissipated when in 1853, Queen Victoria assented to chloroform during the birth of her eighth child.

It is difficult from the perspective of the twenty-first century to comprehend just how momentous the advent of surgical anesthesia was. Surgery had been a race to complete the procedure in the shortest possible time. The pain was unimaginable. The difficulties of performing surgery on a writhing, screaming patient were monumental. Yet as important a breakthrough as anesthesia was and despite all of the benefits it brought, there were concerns about patient safety from the beginning. In 1861, Lente wrote:

Greater or less danger is inseparable from the administration of every powerful agent in the Materia Medica, however cautious and skillful the practitioner by whom it is employed; nor can we reasonably expect that an agent, so powerful as in a few minutes to render the body insensible to the pain of a torturing operation, shall be entirely exempt from risk. By what means, then, can we reduce this risk to its minimum? One tells us that it is to be effected by the use of his inhaler; another by his; another by some peculiar arrangement of the sponge or towel so as to insure a due admixture of atmospheric air, or by regulating the successive doses of the agent. But still the fact stares us in the face that the number of deaths is not diminished. [emphasis added]

In these early days the administration of anesthetics was fairly crude. Ether or chloroform was dripped onto a sponge or towel inside a vessel, usually glass, with a mouthpiece on one end and an opening to room air at the other.

Modern copy of the original inhaler used to administer chloroform
From the Wellcome Trust Image Library via Wikimedia Commons

The patient would inhale through the mouthpiece, drawing room air across the sponge where it would mix with ether fumes. There was no supplemental oxygen, only room air, and the inspired concentration of ether varied with tidal volume, ambient temperature, the size of the sponge and its saturation. As transformative as it was for patients facing surgery, anesthesia was risky business.

Lest the general public who may read this may take alarm and exaggerate the dangers of ether, let me say at once that the deaths are estimated in various statistics as being only one death in 4533 administrations in America (Gwathmey), one in 5112 in Germany, one in 16,302 in Great Britain, and even only one in 50,000 (Rovsing).

W. W. Keen, M.D., LL.D. 12/2/1915

That it was considered acceptable, even laudable, that 1 in 4,500 anesthetics resulted in death should be some indication of the horrors and dangers of surgery without anesthesia.

Keen went on to say:

In this city and this hospital my topic – The Dangers of Ether as an Anesthetic – may at the first blush seem ungracious. But our profession ever seeks the unvarnished and untarnished truth. To recognize that there are dangers is the first step in eliminating them. When life is at stake ignorance is not bliss. Forewarned is forearmed.

There were certainly plenty of dangers to recognize. From the perspective of modern pharmacology, ether is a relatively safe anesthetic agent as it exhibits minimal cardiac and respiratory depression. But overdose was always a concern in those early days when ether was dripped on a handkerchief or given with a crude inhaler. Moreover, ether is associated with laryngospasm – a constriction of the vocal cords that can occlude the airway and lead to asphyxia. Nausea and vomiting are common after ether anesthesia, and the potentially lethal aspiration of vomitus was an ever-present danger. But overdose, ‘over-inebriation’ as it was then known, was the greatest concern.

[The patient was] inhaling, in the continuous way that was at first supposed to be essential to protracted insensibility, through a glass globe of ether, and long after insensibility was manifested. The operation was far from completed, when a bystander happened to feel the pulse. There was no special reason for doubt, inasmuch as the patient was, in general appearance, like all former thoroughly etherized patients. The pulse proved to be barely perceptible, and the patient to be etherized almost beyond recovery. The bystander, after repeated observation of other cases, published the fact, then first observed, that in ether anaesthesia the pulse stood as a beacon between safety and danger, between harmless inebriation and fatal narcotism. This was the discovery that ether was not dangerous; because this showed that its danger gives warning, and is under control. (Clarke & Bigelow, 1876, p. 179)

The ‘bystander’ in this recollection was the eminent surgeon Henry J. Bigelow, the author of the article quoted. In the first days of ether anesthesia, monitoring of the patient’s vital signs was entirely incidental. With Bigelow’s observation came the first introduction of rigor into the delivery of anesthetics. Monitoring of patients’ vital signs moved from observation of pallor and lip hue to structured records noting times, conditions, and basic vital signs. In the early days of the 20th century, blood pressure readings joined heart and respiration rates in the patient record.

Anesthesia developed rapidly. Nitrous oxide, not as effective an anesthetic as ether or chloroform, was nonetheless an important adjunct (and unlike ether and chloroform is routinely used to this day). Early attempts to employ it were limited by the expense of equipment and technical expertise necessary to produce it in sufficiently pure form. Eventually nitrous oxide was compressed and bottled, making its use inexpensive and ubiquitous. Oxygen too was compressed and bottled and new and better anesthesia equipment was developed that allowed the clinician to establish flows of oxygen, nitrous oxide, perhaps other gases, and eventually the vapors of one or more inhalation anesthetics.

The importance of supplemental oxygen and equipment to deliver it can’t be overstated. The air we breathe is mostly nitrogen mixed with about 21% oxygen. Any mixture containing less than 21% oxygen is considered hypoxic. Breathing hypoxic mixtures for more than brief intervals can lead to brain damage and even death, and hypoxia was indeed a significant cause of death from early anesthetics.

When an anesthetic mixture relies on atmospheric air, it starts with at most 21% oxygen. As other agents such as nitrous oxide and anesthetic vapors are added, the oxygen percentage is reduced, as is the total amount of oxygen inhaled. But with the advent of bottled oxygen and equipment to mix it with the other gases, the inspired oxygen concentration could easily be maintained at or above 21%. By 1917 the basic form of the modern anesthesia machine began to take shape.
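For readers who want the dilution arithmetic spelled out, here is a minimal sketch of the idea. It is my own illustration, not anything from the historical record; the function name and flow values are hypothetical. It simply shows how mixing an oxygen-free anesthetic gas with room air drives the inspired oxygen fraction below 21%, and how a dedicated oxygen flow restores it.

# Back-of-the-envelope illustration of inspired oxygen dilution.
# Hypothetical function and flow values; flows are in arbitrary units (e.g., L/min).

ROOM_AIR_O2 = 0.21  # fraction of oxygen in atmospheric air


def inspired_o2(air_flow, o2_flow=0.0, other_gas_flow=0.0):
    """Return the oxygen fraction of the inspired gas mixture."""
    total = air_flow + o2_flow + other_gas_flow
    if total <= 0:
        raise ValueError("at least one gas flow must be positive")
    oxygen = air_flow * ROOM_AIR_O2 + o2_flow
    return oxygen / total


# Room air alone: 21% oxygen.
print(f"{inspired_o2(air_flow=2.0):.1%}")                                   # 21.0%

# Dilute room air 50/50 with nitrous oxide: only ~10.5% oxygen -- hypoxic.
print(f"{inspired_o2(air_flow=1.0, other_gas_flow=1.0):.1%}")               # 10.5%

# Add a dedicated oxygen flow and the inspired concentration is safe again.
print(f"{inspired_o2(air_flow=1.0, other_gas_flow=1.0, o2_flow=0.3):.1%}")  # 22.2%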

For all of its charms, ether was not entirely benevolent. Many people suffered significant vomiting during recovery and, as mentioned earlier, laryngospasm during anesthesia. Chloroform, an alternative, had a narrow window between an effective dose and a lethal one and a nasty propensity for triggering heart arrhythmias. The search for better anesthetics yielded cyclopropane, a compressed gas like oxygen and nitrous oxide, making it easy to incorporate into anesthesia machines. As one early user observed:

My conception of anaesthesia with older gases is that we administer the gas, plus enough oxygen to keep the patient alive and in good condition. With cyclopropane, on the other hand, we administer oxygen with just enough of the anaesthetic gas to keep the patient asleep.

Cyclopropane became increasingly popular in the 1930s and, with better monitoring, better anesthesia machines, better drugs, and better anesthetists, the death rate fell to less than 1 per 1,000 surgeries. You will note that this appears to be a higher mortality rate than at the dawn of anesthesia, when it was perhaps 1 in 4,500. But by the late 1930s much longer and more invasive surgeries were being performed, meaning that patients were kept anesthetized for much longer periods, and much better records of fatalities were being kept. In any event, anesthesia remained a dangerous undertaking.

The risks of overdose and hypoxia were bad enough, but ether, chloroform, and cyclopropane share a characteristic beyond being effective anesthetics – they are all highly flammable, and exceptionally so in the presence of high oxygen concentrations.

A Spanish family brought their twelve-year old boy to the hospital, his lung condition making an operation necessary. The operation was proceeding normally when through a cause we were never able to trace, the glowing cautery set light to the ether vapour used as the anaesthetic. The violent explosion that followed, was repeated almost immediately as an oxygen cylinder blew up. The patient was killed on the spot, the sister and the assistant were injured, and I lost one ear-drum.
Ferdinand Sauerbruch

Almost anything could trigger a fire or explosion: candles before electric lighting was widespread, switching on electric lights after they came into use, hot irons used for cautery, even static electricity generated by rolling an anesthesia machine across the floor. The situation only grew worse with the advent of medical electronics. In the early 1960s Electronics for Medicine (E for M) introduced the ORM-1 EKG monitor for operating rooms. Real-time intraoperative electrocardiography was a revolutionary addition to the surgical suite, but the monitor had to be built inside a sealed metal housing to prevent sparks from triggering explosions. The elaborately-housed ORM-1 was affectionately called the ‘torpedo.’ The Wood Library Museum of the ASA declined my request to use their photograph of an ORM-1 in this essay, but you can see it here.

Despite hypoxia and overdose and fires and explosions, surgery with anesthesia was vastly more attractive than surgery without it. As anesthesia progressed from an ether-soaked rag to the modern anesthesia station, the driving force was physicians who were not content with the status quo of anesthetic care. Patient monitoring progressed from the observation that the patient hadn’t died to a detailed patient record with contemporaneous recording of vital signs and drugs administered. New anesthetic drugs were developed that weren’t flammable. A range of pharmaceuticals emerged that anesthetists used to induce anesthesia, regulate circulatory system activity, paralyze muscles, and control postoperative pain. By about 1980 the anesthesia mortality rate in first world countries was less than 0.35 per thousand, half the mortality rate of just thirty years earlier.

A paradigm shifts

From one perspective, a good deal of science can be seen as a reactive enterprise: an observation is made that triggers further exploration. The very practice of anesthesia began that way. The observation that surgery was too excruciating to last more than a few minutes led to the search for the first anesthetics. Observations of the shortcomings of early drugs led to the search for better ones. Observations of the dangers related to equipment led to better equipment and more equipment. Improvements in the ‘stuff’ of anesthesia were accompanied by improvements to the practice of anesthesia, by greater professionalism in that practice. All of these were driven by the desire to make anesthesia better and safer, but that drive was based on observations of deficiencies that occurred so frequently that they demanded to be addressed.

During the Second World War, a psychologist named John C. Flanagan was tasked with developing tests to identify candidates suitable for combat missions. This led to his study of human behaviors and their consequences in specific activities and, in 1954, publication of The Critical Incident Technique. Today Flanagan’s work informs everything from aviation safety to quality control in manufacturing.

Twenty-five years later, Jeffrey Cooper, a biomedical engineer at Massachusetts General Hospital, and his colleagues used this technique to study how human factors contributed to anesthesia errors. The perspective was beginning to change from reacting to an accumulation of instances of some deficiency to an organized program for identifying the behaviors that impact – positively or negatively – anesthesia outcomes. The Critical Incident Technique would come to drive innovations in training, in equipment design, even in the equipment used during the administration of anesthesia. Here was a tool that could help prevent anesthesia problems before they accumulated into a body of tragedies that triggered efforts at correction. The science was moving from reactive to proactive.

This is not to say that Cooper’s work was immediately embraced. The human factors that Critical Incident Technique studies explicitly questioned were the behaviors of the anesthetists themselves – saying, in short, that the face in the mirror brushing her teeth might be a big part of the problem. That is a difficult proposition to accept for someone with years of training and a deep commitment to professionalism.

On April 22, 1982, ABC television’s news magazine 20/20 aired “The Deep Sleep: 6,000 Will Die or Suffer Brain Damage” and it rocked the anesthesia world:

If you are going to go into anesthesia, you are going on a long trip and you should not do it, if you can avoid it in any way. General anesthesia is safe most of the time, but there are dangers from human error, carelessness and a critical shortage of anesthesiologists. This year, 6,000 patients will die or suffer brain damage.

20/20

Anesthesiology was facing a malpractice insurance crisis in the 1970s and 80s. Anesthesiologists accounted for about 3% of physicians and about 3% of malpractice claims, but a whopping 12% of liability insurance payouts. Some anesthesia mishaps were minor, for instance a tooth chipped because of poor laryngoscope technique. But in general, anesthesia mishaps were likely to result in death, nerve damage, or brain damage. Many anesthesiologists were arguing for tort reform to limit malpractice payouts and blunt malpractice insurance costs. But visionaries in the field held that the practice of anesthesia could be made very much safer.

Ellison “Jeep” Pierce, MD, had developed an interest in patient safety in 1962 when, as junior faculty at Harvard, he was assigned to give a lecture to residents on “anesthesia accidents.” He maintained his interest over the years, compiling a fat file of clippings and notes, and in 1983, while first vice president of the American Society of Anesthesiologists, he led that organization to create the Committee on Patient Safety and Risk Management with the goal of better understanding the causes of anesthesia mishaps. Richard Kitz, an anesthesiologist at Mass General, grasped the importance of Pierce’s committee and suggested to him an international meeting on patient safety. That meeting, the International Symposium on Preventable Anesthesia Mortality and Morbidity, led to the creation, in 1985, of the Anesthesia Patient Safety Foundation (APSF). Unique in its time, the APSF brought together a broad range of stakeholders including nurses, regulators, manufacturers, attorneys, and others with the express goal that “no patient shall be harmed from anesthesia.” Not beholden to any one particular institution, the APSF was free to explore controversial subjects and to follow research wherever it led. Perhaps most importantly, it cultivated a culture of patient safety that grew to permeate the practice of anesthesia.

Kitz, the anesthesiologist who suggested the symposium, was the chair of Jeffrey Cooper’s department at Mass General. Critical incident analysis, which Cooper championed, and root cause analysis became important tools in understanding adverse anesthesia outcomes and in building strategies to minimize them. As the underlying causes of anesthesia mishaps became clear, new monitoring devices for inspired oxygen, oxygen saturation in the blood (SaO2), and the carbon dioxide content of respired gases (EtCO2) became standard equipment wherever general anesthetics are delivered. Simulators were developed that allowed clinicians to practice specific techniques for managing, for instance, difficult intubations, and to rehearse critical event management. Programs such as the Closed Claims Project continue to explore anesthesia mishaps for lessons that can improve anesthesia safety.

In 1999 the Institute of Medicine released To Err Is Human: Building a Safer Health System. The report notes the focused attention of the anesthesia community:

Patients who died during surgery requiring general anesthesia have been the focus of many studies over the last few decades. Anesthesia is an area in which very impressive improvements in safety have been made. As more and more attention has been focused on understanding the factors that contribute to error and on the design of safer systems, preventable mishaps have declined. Studies, some conducted in Australia, the United Kingdom and other countries, indicate that, today, anesthesia mortality rates are about one death per 200,000–300,000 anesthetics administered, compared with two deaths per 10,000 anesthetics in the early 1980s. The gains in anesthesia are very impressive and were accomplished through a variety of mechanisms, including improved monitoring techniques, the development and widespread adoption of practice guidelines, and other systematic approaches to reducing errors.

Today organized efforts to improve outcomes and safety can be found in nearly every medical specialty. There are federal agencies such as the Agency for Healthcare Research and Quality and independent organizations such as the National Quality Forum, all working to make medicine safer and more effective. But apparently this deep commitment to patient safety is not shared by the CAM community. I tried to find similar organizations, or at least organized efforts, within naturopathy, chiropractic, and acupuncture, but came up empty. Perhaps my google-fu needs work.

Conclusion: Science is hard, worthy work

We take from this a number of important lessons. The advent of etherization and the dawn of anesthesia transformed surgery from a race against time for the surgeon and a gruesome and blood-curdling exercise in surviving unimaginable pain for the patient, to a largely painless procedure offering the surgeon ample time to explore, dissect, and repair a variety of problems. But with that incredible discovery came a set of quite specific risks, each requiring careful study and concerted effort to abate. A single discovery can have a major impact, but in the wake of that discovery new questions arise and in exploring those questions new insights are gained and new discoveries are made. That is how science works.

A common refrain of quacks and their enablers accuses science of ‘not knowing everything.’ And it doesn’t. Science is not a destination, it is a process. Scientists do not make a discovery, declare victory, and go home. Each insight adds a pixel or two to a very large and complex picture. Each one suggests new questions and new avenues of exploration. But ‘science doesn’t know everything’ is not the same as science not knowing anything. Science knows vastly more than any other system of physical knowledge ever devised. In particular, medical science learns more in any given month than all of the various medical quackeries combined have learned since the beginning of recorded history.

We can also take the lesson that there is much more to science than the occasional stroke of genius. Many in the quack community paint themselves or their gurus as solitary mavericks with earth-shattering discoveries that confound narrow-minded scientists. They never are. Ernst T. Krebs, a serial failure at all things academic, was held up as a maverick genius for ‘discovering’ Laetrile, once proposed as a cure for cancer. It wasn’t. Stanislaw Burzynski has been glorified as a maverick genius by filmmaker Eric Merola and a legion of hopeful cancer sufferers for his “revolutionary” antineoplaston therapy to cure cancer. He isn’t. Even the most generous would not have termed Morton a genius. His genius was not in discovering ether (first synthesized in 1540). His genius was not in recognizing that inhalation of ether rendered one insensible (the recreational use of ether for that purpose, so-called ether frolics, had been around since the 18th century). His genius was not even in using ether as an anesthetic. Others including Horace Wells, Charles Jackson, and Crawford Long either used ether for that purpose or discussed its potential as an anesthetic. William Morton’s genius, if indeed any could be claimed, was in advocating for ether’s use as an anesthetic, demonstrating its efficacy in that role, and tirelessly lobbying the US Congress to recognize him – and compensate him handsomely – as the discoverer of etherization, a goal he never achieved.

Science is not so much dependent on the flash of brilliance of lone mavericks as it is on the combined effort of thousands of scientists and technicians, each working alone or in groups, pushing back the darkness a millimeter at a time. Some labor an entire career with their contributions understood by only a handful of others. But in aggregate, those millimeters, multiplied by the thousands at work, keep science moving at an astonishing pace. And when individual flashes of brilliance translate into huge leaps of scientific advancement, it is important to remember that those leaps would not be possible without the careful work of all those who laid the foundations. The story of anesthesia, its discovery, and its long, slow evolution to the incredibly safe and effective science it is today stands as testament to some of those thousands, most of whose names we will never know, much less remember.

The last lesson is that quacks are wrong when they claim that physicians don’t treat the whole (hol?) patient. Early etherization revolutionized surgery, but that revolution came with some costs. Physicians and scientific researchers developed better equipment to deliver better anesthetics, better monitoring to assure safer anesthesia delivery, and better incident analysis to assure continuous improvement of both the quality and the safety of anesthesia. The ‘symptom’ of surgical pain was ‘cured’ by etherization in 1846. Yet science-based medicine continues to refine the anesthetic experience to deliver the best and safest outcome for the whole patient.

The story of the commitment of medical science to safe and effective treatments is vast. Today’s essay chose anesthesia as the exemplar of the history of and continuing commitment to better health care. Oncology or surgery or radiology or any of dozens of other disciplines would have served just as ably.

Imagine that the only anesthesia available for removal of an appendix or even an impacted wisdom tooth is acupuncture. Imagine that leukemia can only be treated with vitamin C infusions and homeopathic nostrums. Imagine radiology used only to image subluxations that are imagined to impede the nerves and result in liver disease or kidney disease. This would be medicine without science. Some of this was medicine without science just a century ago. The embrace of quackery by institutions great and small, quackademic medicine, means some of this is considered medicine today and perhaps more of it will be considered medicine tomorrow. This should not be the legacy we leave. This is yet another reason why science-based medicine matters.


Author’s Note: The history of anesthesia’s march from decidedly mixed blessing to nearly universally safe and effective care could easily fill a substantial book replete with interesting – sometimes eccentric – characters and compelling, occasionally hair-raising, stories. Seventy-five thousand words would probably do it. But for blog essays, I try (and here have failed) to keep to a limit of 1 Gorskon (1 Gorskon = approximately 4,000 words). So the narrative has necessarily skipped like a flat stone over still water, touching only a few high spots to illustrate the main argument and carry us to the conclusion.

I am deeply indebted to Robert K. Stoelting, MD for sharing a prepublication chapter of Quality and Safety in Anesthesia and Perioperative Care and for his generous suggestions of other resources for this essay. Dr. Stoelting is the author of several textbooks including the classic Handbook of Pharmacology and Physiology in Anesthetic Practice. Dr. Stoelting is the soon-to-retire president of the Anesthesia Patient Safety Foundation.


*I will not use the newly preferred term of Integrative Medicine as it is little more than an Orwellian distortion of language meant to obscure rather than to elucidate. What is to be integrated is the unproven, the disproved, and the nonsensical. Stripping away these integratees leaves the proven and that is not in need of integration with anything else. The proven is science-based medicine.

**In fact there is good evidence that Crawford Long, MD, used ether anesthesia in his Georgia surgical practice beginning in 1842, but Long didn’t publish and history generally credits Morton.


Author

  • James Thomas was the managing editor of Alabama Sun magazine and has worked with manufacturing companies in the United States and abroad.
