Cell Phones and Brain Tumors

The question of whether there is a link between the use of mobile phones (also called cell phones) and the risk of brain tumors has been cropping up more and more frequently in the media – every time a new study or analysis comes out. This is a very important public health question, as cell phone use is becoming more common and brain tumors are a serious, often life-threatening category of disease.

Of course such questions are best answered by a dispassionate, careful, and systematic look at the science – what is the plausibility of a link, and what is the evidence that there actually is one? At this point we are somewhere in the middle of studying this problem. We already have substantial data, but it is conflicting, and the research community is still debating how to get more definitive data everyone can agree upon. So at present there is a variety of opinions on the matter. The consensus seems to be that cell phones probably do not cause brain tumors, but we’re not sure, there is meaningful dissent from this opinion, and so more study is needed.

There are two types of scientific studies we can do to answer this question. The first is biological and looks at the effects of radiation, and specifically the type and strength of radiation emitted by cell phones, on cells in a test tube and on animals. This will tell us if a risk from cell phones is plausible, if there is a mechanism, and what, if any, the effects are likely to be. But this kind of data will not tell us if cell phones in fact have caused or are causing brain tumors.

The second kind of scientific evidence is epidemiological, which looks for a correlation between cell phone use and brain tumors. Epidemiological studies look more directly at the actual question, but have their own complexities and limitations. It is difficult, if not impossible, to perfectly isolate the variable of interest (cell phone use) and measure its correlation to another variable (brain tumors). Further, as soon as you start asking the question, a myriad of sub-questions emerge:

How much cell phone use is necessary to cause a brain tumor? This further breaks down into the variables of intensity of the radiation, frequency and duration of each exposure, duration of total exposure, and the delay from exposure to tumor appearance. And further, is there an apparent dose-response – does a higher dose of cell phone radiation cause an increased risk of tumors?

What kind of tumors are we talking about? Does use correlate with an increased risk of benign tumors, all tumors, or only malignant tumors? Does it correlate with tumors on the side of preferred cell phone use, or anywhere in the brain? What if cell phone use increases the risk of getting cancer from other causes, but is not a risk by itself? In other words, maybe smokers get more smoking-related cancers if they also use cell phones.

Then there is the issue of sub-populations. Are children more at risk? What about men vs women, or those with compromised immune systems?

There are also different ways to look at the data. We can begin with a population of cell phone users and another population of non-users, follow them over time, and keep track of who gets what kind of brain and other tumors (this is called a cohort study). Or we can look at 100 people with brain tumors and then quiz them about their prior cell phone use (a case-control study). Or we can do a population study by surveying a large population for a number of variables, like cell phone use and brain tumors, and then do a statistical analysis to look for correlations.

And here we begin to see the problem with epidemiological studies. They provide very useful information, but variables can multiply endlessly, forever muddying the waters. A pattern can emerge, however, after multiple different types of epidemiological studies are completed on different populations with varying methods. A consistent pattern favoring a lack of correlation, or a positive correlation, can emerge and become highly reliable. It must be pointed out, though, that such studies cannot, by definition, prove the absence of any correlation. They can only set statistical limits on the probable maximum size of any such correlation. A correlation smaller than the power of the studies to detect is always possible.
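To make that last point concrete, here is a rough sketch (my own illustration, not taken from any study discussed here) of how sample size bounds the smallest relative risk a cohort study can reliably detect. It uses the standard two-proportion normal approximation and assumes a baseline brain tumor incidence of 1 in 10,000 over the study period:

```python
import math

# z-scores for a two-sided alpha = 0.05 test with 80% power
Z_ALPHA = 1.959964
Z_BETA = 0.841621

def required_n_per_arm(p0, rr):
    """Subjects per arm needed to detect relative risk `rr` over a
    baseline incidence `p0` (two-proportion normal approximation)."""
    p1 = rr * p0
    pbar = (p0 + p1) / 2
    num = (Z_ALPHA * math.sqrt(2 * pbar * (1 - pbar))
           + Z_BETA * math.sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2
    return num / (p1 - p0) ** 2

def min_detectable_rr(p0, n):
    """Smallest relative risk detectable with n subjects per arm,
    found by binary search (required_n_per_arm shrinks as rr grows)."""
    lo, hi = 1.001, 50.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if required_n_per_arm(p0, mid) > n:
            lo = mid
        else:
            hi = mid
    return hi

p0 = 1e-4  # assumed: 1 brain tumor per 10,000 people over the study
print(min_detectable_rr(p0, 100_000))    # ~2.7: 100,000 per arm can only rule out roughly a tripling of risk
print(min_detectable_rr(p0, 1_000_000))  # ~1.4: a much larger study bounds smaller risks
```

A correlation below the detectable threshold remains possible no matter how many subjects are enrolled; larger studies merely push the bound closer to 1.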

So what does the current evidence say about cell phone use? The biological studies have largely been negative, although some studies have shown changes in cells or their genes after prolonged exposure to cell phone radiation. However, the exposures were greater than what would occur with even frequent cell phone use, so the utility of these studies is questionable.

The epidemiological evidence can best be described as “mixed.” In other words, there is no strong signal, no strong correlation between cell phones and brain tumors. Neither, however, has any correlation been adequately ruled out. We are still in that pesky “we need more data” phase. Here is the FDA summary of the evidence so far.

A recent meta-analysis suggested that there may be a small increase in risk for certain kinds of tumors, but only in those with exposure for greater than 10 years. I do not put a great deal of faith in meta-analyses – they have their own problems, and I prefer systematic reviews – but sometimes they give a snapshot of the current literature on a specific question.

This meta-analysis, however, was also published before an even more recent, and very large, UK study that found no association between cell phones and tumors. That’s reassuring, but the literature is likely to go back and forth like this for a while. Eventually, all of the criticisms and shortcomings of prior studies will be used to design a few very large and fairly definitive studies, and then a firmer consensus will likely emerge.

Recently a neurosurgeon, Dr. Vini Khurana, published on the web the results of his systematic review of the literature (often misleadingly referred to by the press as a new “study”). He concludes that the evidence is trending toward the conclusion that there is a correlation between cell phone use and brain tumors, but only for exposure durations of >10 years. He dismissed much of the negative evidence because those studies primarily examined exposure durations of less than 10 years. His analysis has yet to be peer-reviewed (by report this is in process).

For now, we remain hopeful but cautious. For those who want to err on the side of caution, there are some reasonable recommendations (these come from multiple organizations, so they seem to represent a consensus).

- Limit your cell phone use

- Do not allow small children to begin using cell phones.

- Use a headset to increase the distance from the antenna to your head.

Why not allow kids to use cell phones? This is purely speculative at this point, but the fear is that their thinner skulls will allow more radiation to pass through, their smaller brains will not dissipate the heat as well, and their immature development stage will make them more susceptible to any biological effects. All plausible, but unproven. Studies specifically looking at kids are on the way, but no data yet.

Of course, like any scientific or health issue these days, there is a layer of pseudoscience piled on top of this question. There is one notable crank, Arthur Firstenberg, who has been ranting for years about the evils of cell phones and other wireless technology. His writing reads like classic conspiracy-based fear mongering, with a distinct aftertaste of crank. He quotes numerous dubious scientific claims about cell phones without ever providing proper references – but of course, that is because big industry is hiding what it has all known for nearly a century. Bottom line – don’t believe the hysteria.

There is also a cottage industry of entrepreneurs who would love to sell you a device that protects you from cell phone radiation. These devices tend to fall into one of three categories: 1) pure magic, like crystals; 2) technologically dressed-up but with no actual effect; or 3) they actually shield cell phone radiation, but at the expense of the wireless signal that makes phones work. So far no one has figured out a way to shield against cell phone radiation without shielding against cell phone radiation.

But just because there are some fear-mongering or greedy pseudoscientists out there does not mean that the claims can be dismissed. What the evidence shows is that there is biological plausibility for a negative effect; the epidemiological evidence for any correlation with <10 years of exposure is mixed but leaning negative, and for >10 years of exposure is mixed but leaning positive. I think we can rule out a strong correlation (meaning a large risk), but not a small one. It is reasonable to caution about cell phone use in kids until we get some data either way. And we need more data all around before the question can be put to bed.


Posted in: Neuroscience/Mental Health, Public Health


34 thoughts on “Cell Phones and Brain Tumors”

  1. mugwump says:

    We may’ve lucked out on this question. If I recall correctly, something like 90 percent of the Italian population has a cell phone. Assuming their rate of brain cancer is similar to other countries, in a few years, say, a decade to be completely sure, we can compare their cancer rates to other countries and see if there’s a significant difference. It will take a while, but with a sample size so large the results will be fairly ironclad.

  2. qetzal says:

    Nice post. I’m a bit curious, though. First you said:

    The biological studies have largely been negative, although some studies have shown changes in cells or their genes after prolonged exposure to cell phone radiation. However, the exposures were greater than what would occur with even frequent cell phone use, so the utility of these studies are questionable.

    That doesn’t seem to support your concluding comment:

    What the evidence shows is that there is biological plausibility for a negative effect….

  3. apteryx says:

    It’s quite common for in vitro or animal studies to have to use exceptionally high doses of an agent to produce a side effect quickly enough, and in large enough numbers, to be statistically significant. Such results provide modest cause for concern that if humans get a smaller dose over 50 years or so, some smaller percentage of them might suffer the same effect. Of course, the dose-response curve may often be nonlinear (occasionally, provably so), and the low dose may be harmless. In vitro studies cannot provide evidence of harm in vivo; they are only evidence that the possibility of harm ought to be investigated. Let’s put it this way, if similar in vitro studies suggested that the emanations from some CAM device were genotoxic at very high doses, would you say that there was no biological plausibility for any negative effect and no point in asking further questions about it?

  4. daedalus2u says:

    apteryx, there is no biological effect that is actually linear. A linear dose-response correlation is chosen because it is the simplest dose-response correlation.

    The technology to shield electromagnetic radiation is well known. The simplest and cheapest is known as the “aluminum foil cap”. Aluminum foil provides virtually 100% shielding at the wavelengths used for cellphones (provided there are no holes).

    Of course that would also shield natural electromagnetic radiation which may (or may not) have biological functions.

    Regarding potential adverse effects on children, I would put heat dissipation at pretty low potential for adverse effects. The region of the body that is most sensitive to microwave heating is the transparent solid parts of the eye where there is no blood circulation. Blood flow is what dissipates heat, and there can’t be blood in the lens and cornea because they need to be transparent to wavelengths to which blood is opaque.

    I suspect adverse effects in children (if there were any) would relate more to neurodevelopment and axon tracking. Detecting subtle effects in the presence of very large natural variation would be extremely difficult even with prospective studies with deliberate high and known exposures (which would be unethical).

  5. Godless Geek says:

    I have a very difficult time accepting that cell phone radiation causes cancer. The transmitter power on a cell phone is about 1 W. You have stuff way stronger than that hitting your head every day with no ill effects. The stuff that comes out of a cell phone is also just naturally low energy. We’re talking about signals in the 800-950 MHz range. If you have a cordless phone in your home, it likely operates in the 2.4 GHz range, possibly even higher. EM radiation doesn’t begin to ionize until it gets over about 3000 THz, and it doesn’t cause damage to DNA until that point. Before that, the only chance it has of damaging it is by heating it to the point that it basically gets cooked, and we’re talking about energy levels way too low. Microwave ovens operate at 2.45 GHz at powers of 1 MW or greater to produce enough energy to cook with. 900 MHz at 1 W isn’t even going to make it past your skin.
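The photon-energy argument in the comment above is easy to check with Planck’s relation E = hf (the 900 MHz and ~3000 THz figures are taken from the comment; the constants are standard values):

```python
H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(freq_hz):
    """Energy of a single photon at the given frequency, in eV."""
    return H * freq_hz / EV

cell_phone = photon_energy_ev(900e6)   # 900 MHz cell phone carrier
ionizing = photon_energy_ev(3000e12)   # ~3000 THz, the rough ionization threshold above

print(cell_phone)             # ~3.7e-6 eV per photon
print(ionizing)               # ~12.4 eV, where ionization begins
print(ionizing / cell_phone)  # a cell phone photon is ~3.3 million times too weak to ionize
```

Per-photon energy is what matters for breaking chemical bonds; raising the transmitter power adds more photons, not more energetic ones.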

  6. Godless Geek says:

    Ok…one mistype in my last reply. Microwave powers are 2.45 GHz at 1 kW, not 1 MW.

  7. apteryx says:

    I am aware that there is little evidence to suggest that linear dose-responses are common in nature, but that is what a lot of alleged experts assume to be the case, and if you do not pay at least lip service to the concept by dismissing it politely, someone will turn around and yell that OF COURSE anything dangerous at high levels is “risky” at low levels also.

    Are you seriously suggesting the employment of tinfoil hat technology? That would have looked better yesterday. ;-)

  8. qetzal says:

    apteryx asked:

    Let’s put it this way, if similar in vitro studies suggested that the emanations from some CAM device were genotoxic at very high doses, would you say that there was no biological plausibility for any negative effect and no point in asking further questions about it?

    No, but that’s not what was stated. The statement was that most studies have found no effect, but a minority have showed “changes in cells or their genes” after prolonged exposure to high levels. If that’s a fair characterization of the in vitro studies, then I think it’s going a bit far to claim “biological plausibility for a negative effect.”

    OTOH, maybe (OK probably) I’m just being too nitpicky and semantic.

    :-)

    P.S. There was a post at Effect Measure a while back on a study that claimed to show changes in protein expression in the skin of human volunteers in response to cell phone radiation. However, after looking at the details, it was clearly a poorly designed study, and did not demonstrate what the authors claimed (as I tried to explain to Revere in the comments of that thread).

    I’m not so interested in this topic that I’m willing to really dig through the literature. However, if anyone knows of a good study that shows the potential of cell phone radiation to have harmful effects, I’d be grateful for the link/ref.

  9. apteryx says:

    If you can generate genetic damage in properly controlled in vitro studies, immediately you have biological plausibility for a negative effect. It is also biologically plausible that there is no negative effect. Plausibility is not certainty. It just means that the hypothesis that there could be an effect is not inconsistent with known science.

    IMO, the biggest health risk associated with cell phones is the risk of being run down by idiots yapping on cell phones while driving SUVs. However, this is a relatively new technology and it’s fair to ask what else it might do to us. I don’t subscribe to the extreme version of the precautionary principle that would not allow any new technology on the market until it has been tested for every conceivable risk. (Although I do think that plants not previously consumed by humans should be tested in animals first.) Let it be marketed for those who wish to utilize it, but at some point check to see if the guinea pigs (I mean early adopters) are suffering any harm from it. Some people seem to assume that if it’s a shiny new corporate product and popular with the upper classes, there surely can’t be any harm in it.

  10. qetzal says:

    Yes, IF you can show genetic damage or other clear deleterious effects (not just “changes”), and IF you can show it reproducibly under a given set of conditions (not just in a minority of studies).

    OTOH, if you don’t see clear deleterious effects, or if some studies claim to see them but others don’t (under similar conditions), you have not yet demonstrated biological plausibility.

    So which is it? Are there reproducible studies showing deleterious effects in vitro?

  11. overshoot says:

    Nonlinear effects? Absolutely. There’s at least one that is guaranteed highly nonlinear, in fact, and that’s heating. The effects of added heat load on tissues are pretty nearly exponential thanks to the whole activation-energy relationship, after all.

    Put another way, it’s a safe bet that you won’t get one ten-thousandth the damage from 100 mW that you will from a kilowatt; the latter is well past the LD100 for ten minutes of exposure and the former is sustainable at negligible morbidity for days or longer.

    OK, so it’s a trivial example but still a useful one, especially in context.

    My own take is that this is another instance of rationality being sacrificed in the pursuit of a risk-free society. I’m fine with identifying significant risks, but the key word there is “significant.” I have gotten to the point of making myself walk away from people who get apoplectic over the unconscionable risks we’re exposing their precious darlings to (thanks to the scare du jour). Meantime they’re running off with those same darlings, unbelted, to Mickey D’s for dinner (a five-minute drive in rush hour) because cooking dinner would interfere with the Pop Warner game.

    Don’t even mention the subject of “acceptable risk” to them, though, because of course no risk is acceptable where their children are concerned.

  12. apteryx says:

    Well, this is one of those issues about which I know very little (both because I don’t think it’s a big health risk, and because I wouldn’t have a cell phone anyway if AT&T paid me), so I am not at all familiar with the literature. I am relying on authority, namely the fact that Dr. Hall, who is rather conservative, has seen the research involved and does not reject its admissibility as evidence. She dismisses the “utility” of the studies showing negative effects because they use high exposure doses, but apparently finds no reason to speculate that the data themselves are not real. (Contrast this to how she would describe the studies that purport to present data supporting the plausibility of “water memory.”)

    Qetzal, you say “if some studies claim to see [effects] but others don’t… you have not yet demonstrated biological plausibility.” This attaches the dismissive word “claim” to the publication of positive results, rather than to the publication of negative results, implying that the positive results might be imaginary or faked. What if we remark instead that some studies have “claimed to see no effects”? Again, I have no idea whether there have ever been any studies showing negative results under conditions identical to those that produced positive results. If there have been, though, it is not necessarily the positive studies that are of lower quality or in error, and either way, the likelihood is that researchers are reporting whatever results they have actually seen, given their experimental design.

  13. daedalus2u says:

    Aluminum foil as shielding for electromagnetic radiation is no joke. If you are worried about cell phone radiation, aluminum foil will block it.

    Extrapolation from high doses to low doses can only be done by using a model that relates the response to the dose. There are an infinite number of possible dose-response models that will fit any set of data. The usual preference is to choose the simplest one that fits the data and what is known of the actual physiology. Often that is done even when it is known that the underlying physiology isn’t linearly affected.

    In the case of tissue damage due to overheating, there are well-known threshold effects. Whole body exposure to 10,000 degrees above ambient will cause fatal injury in less than 1 second. Exposure to 10 degrees above ambient will not cause fatal injury in 1,000 seconds (or ever). This is symptomatic of trying to extrapolate from regions of the dose-response space where you can measure effects to regions where you can’t.

    This is where knowledge of physiology is essential. We know that a major mechanism for thermal injury is denaturing of proteins. That has certain kinetics which depend on a lot of things, and is highly non-linear. It doesn’t happen below a certain temperature range (depending on other parameters). We know it would be fundamentally wrong to extrapolate from a temperature regime where we can measure thermal damage to temperatures where we can’t and expect there to still be damage.

    Thermal damage from cell phone radiation in bulk brain cannot occur by any mechanism that is known. Everything that is known about thermal effects of microwave radiation, heat transfer, blood flow and denaturing of proteins would suggest that thermal effects to the brain from a few Watts of external RF radiation could not be significant.
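This thermal argument can be put in rough numbers (a back-of-the-envelope sketch only; 2 W/kg is the localized SAR exposure level used in studies such as REFLEX, and ~3600 J/(kg·K) is a textbook specific heat for soft tissue):

```python
SAR = 2.0          # W/kg, localized specific absorption rate used in exposure studies
C_TISSUE = 3600.0  # J/(kg*K), approximate specific heat of brain tissue

# Worst case: assume every absorbed watt stays put (no blood flow,
# no conduction). The heating rate is then simply SAR / heat capacity.
rate_k_per_s = SAR / C_TISSUE
print(rate_k_per_s * 60)  # ~0.03 K per minute even with zero heat removal
```

Since perfused tissue sheds heat far faster than this trickle accumulates, bulk brain temperature barely moves; the avascular lens, as noted in an earlier comment, lacks that protection.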

    When you don’t know what is going on, assuming a dose-response relationship is problematic. Take ionizing radiation for example. Whether a cell is damaged or not depends on whether the ionizing radiation passed through the cell and whether the ions from that passage damage critical components such as a crucial gene that particular cell needs. Whether a particular cell is damaged by a particular quantum of ionizing radiation doesn’t depend on the “dose” it receives; it depends on the precise details of what the ions generated by the passage of that single quantum of radiation damage. There isn’t one possible “injury”, there are many tens of thousands depending on which part of which gene(s) get damaged. If the only genes damaged are not expressed by that cell, there may be no adverse effect.

    When cells divide, they need to use many genes not otherwise needed, and there are a number of “checkpoints” that stop cells from dividing if there is DNA damage. That is one reason that cells that typically don’t divide (such as nerve cells) are much more resistant to ionizing radiation than cells that divide frequently (such as blood stem cells and gut cells).

    It needs to be remembered that a “safe dose” is not a scientific question but rather a political one. There are necessary trade-offs of costs vs. benefits which are not “scientific” questions but value judgments. Much of the problem with addressing these political problems is that who gets the benefits and who bears the costs is not always the same individuals.

  14. Wallace Sampson says:

    OK, I’ll bite.

    My take is that this is a non-problem. There is always a risk to dismissing details – sometimes referred to as nits – but in this case the nits cannot be removed. What has to change is one’s level of tolerance for them.

    We went through a similar controversy 20 years or more ago during the power lines wars, at which time similar arguments were being made, although the physics and electronics were different. But the associations between EMF from power lines and (name the disease) were equally implausible. The idea was generated in one epidemiologist’s head and her poorly done survey of calculated EMFs that no one actually measured.

    Plausibility was close to nil. It took some 10 years and over 15 case-control and population studies to put the matter to rest despite the absence of plausibility. The major problem with power lines being the long wavelength, which, like the earth’s magnetic field, could not be shown to produce significant cellular changes.

    So there are several issues (unknowns) that have to be estimated and included in an equation before one can calculate a “solution” – in this case a probability.

    Here is where I may differ with others. Physical and biological plausibility is not just one of two ways of looking at the matter. The basic science and the epidemiology are not equal. So much is known about the physics of microwaves that it is much more significant a factor than the results of mixed methodology epidemiological studies that we know are inaccurate.

    In biological terms, we know microwave radiation produces no significant genetic changes. It adds energy to electrons, but expresses as heat, not as ionization or chemical linkages. Thus, one has to make one leap over the physics.

    Then one must make the derivative leap from heat to carcinogenesis, which to me also either does not exist or is of such low plausibility that it can be dropped as being too many decimal places out.

    Second is the nature of epidemiological studies. Not only does each study, when perfectly carried out, have to have a relative risk result exceeding control of 1.5-2 for significance, but collating results of multiple studies often will not settle things, but simply produce the same differences among studies – repeatedly. It’s the nature of the beast. One cannot expect narrower ranges of results unless one repeats the same study on the same population under the same circumstances. And, perform perhaps ten times as many of them. That we cannot do, or at least should not attempt. There are more efficient ways to a reasonable answer. I know, larger numbers of studies would narrow the statistical mean variation, but that doesn’t help me much because of the former limitations.

    Thus in this case, basic and biological science trumps epidemiology. Plausibility of carcinogenesis is not indeterminate at this point – it descends asymptotically toward zero.

    Third, one must allow for the biases of the researchers – the way they set up the study, selective reporting such as occurred with this latest press release that unfortunately made all the news and probably re-stimulated this discussion. Fabrication – unintentional and intentional – lives. And it thrives best on the sunny side of positive results, on the short road to headlines and grant money. The EMF controversy was fed by several in vitro papers that were later shown to have been made of fabricated data.

    History is ignored at one’s own risk. For instance, looking up echinacea, there never was a common folk use of echinacea for colds and flus among Native American tribes … the idea was generated in the heads of an American eclectic practitioner of the 1800s and a Swiss naturopath who saw marketing possibilities in Europe – and did it.
    The cell phone brain tumor claim began with a talk show call-in listener whose wife developed a brain tumor and who got it into his head that a cause must be found, and the only one nearby was their cell phone. That was it. There never was a study or an objective observation of any association.
    Ignore material like that and one can easily get lost in the indeterminacy of pseudoscience. Public money spent by the billions chasing non-causes of non-events.

    One may say that such bias is impossible to measure. OK. Nevertheless, the historical facts should make implausible claims that much more scientifically implausible – much more unbelievable.

    The author of the recent report took a look at something not previously measured – association after ten years – and apparently found a small association – how much is yet to be published. But even if there is a small significant association, it would not likely be more than noise in the data collection – that’s the nit. Data variations that are part of the system, and that do not measure the intended end point, but measure the inaccuracy level of the measuring system.
    Ioannidis incarnate.

    Extension: The latest review should have been accurately reported, and reporters and editors should have consulted with experts. Simplest question: Has there been a significant increase in brain tumors since invention and common use of cell phones? What? Don’t know? Look it up. I could not find one. My son did: “It’s on SEER. Up 1.5 percent per year between 1975-1987; down 0.3 percent per year 1987-2004.”
    A one percent increase of a low-incidence tumor – worst case, incidence 1/10^5/yr for the US would be about .3 – 3 cases per year – is in the “noise” range and not worth the effort now being expended. Any association should have shown by now, even if limited to the ten-year group. Another non-fact not needing a non-explanation.

    The significance is the smirch on the face of our news sources, which can be counted on to bring us biased news. Easy way out: publish press releases without investigation. Something I have difficulty murmuring while watching or reading.
    WS

  15. qetzal says:

    apteryx,

    I didn’t mean to sound dismissive, and I’m certainly not implying that any positive results are a priori less trustworthy than negative results.

    Having argued over semantics this long, I decided to bite the bullet and look at the literature. The FDA Summary that Dr. Novella linked says:

    One kind of test, called a micronucleus assay, showed structural changes in genetic material after exposure to simulated cell phone radiation.

    Based on that, I searched PubMed for micronucleus AND fr. That returned 15 publications dating back to 1997. From the abstracts, one was not relevant to this topic (#4 of 15, in case you want to judge for yourself). Of the rest, 10 found no association between RF exposure and markers of genotoxicity (#1-3,5,8,10-12,14,15). Four studies reported positive associations (#6,7,9,13).

    You stated:

    I have no idea whether there have ever been any studies showing negative results under conditions identical to those that produced positive results.

    Interestingly, one of the hits does exactly that:

    Speit G, Schütz P, Hoffmann H. Genotoxic effects of exposure to radiofrequency electromagnetic fields (RF-EMF) in cultured mammalian cells are not independently reproducible. Mutat Res. 2007 Jan 10;626(1-2):42-7.

    Conflicting results have been published regarding the induction of genotoxic effects by exposure to radiofrequency electromagnetic fields (RF-EMF). Using the comet assay, the micronucleus test and the chromosome aberration test with human fibroblasts (ES1 cells), the EU-funded “REFLEX” project (Risk Evaluation of Potential Environmental Hazards From Low Energy Electromagnetic Field Exposure Using Sensitive in vitro Methods) reported clearly positive effects for various exposure conditions. Because of the ongoing discussion on the biological significance of the effects observed, it was the aim of the present study to independently repeat the results using the same cells, the same equipment and the same exposure conditions. We therefore exposed ES1 cells to RF-EMF (1800 MHz; SAR 2 W/kg, continuous wave with intermittent exposure) for different time periods and then performed the alkaline (pH>13) comet assay and the micronucleus test (MNT). For both tests, clearly negative results were obtained in independently repeated experiments. We also performed these experiments with V79 cells, a sensitive Chinese hamster cell line that is frequently used in genotoxicity testing, and also did not measure any genotoxic effect in the comet assay and the MNT. Appropriate measures of quality control were considered to exclude variations in the test performance, failure of the RF-EMF exposure or an evaluation bias. The reasons for the difference between the results reported by the REFLEX project and our experiments remain unclear.

    If our data set were confined to these 14 publications, I would strongly dispute any claims of a biologically plausible mechanism. Of course, they may be unrepresentative of the whole field, but I suspect otherwise.

  16. qetzal says:

    The PubMed search string should be micronucleus AND rf.

  17. apteryx says:

    Qetzal – Thanks for your efforts in looking it up. It certainly sounds like, if there is any effect at all, it is not a large one, which probably means it would be negligible in vivo. (Given the plethora of reasons for variation among studies, I would not be inclined to presume that researcher bias or fabrication was responsible for whichever results I do not like.)

    Dr. Wallace writes:

    “Ignored at risk of the history. For instance, looking up echinacea, there never was a common folk use for colds and flus of ech. in Native American tribes … the idea generated in the heads of an American eclectic practitioner of the 1800s and a Swiss naturopath who saw marketing possibilities in Europe – and did it.”

    I’m not sure of the relevance, since microwave radiation never had a common folk use either and we are employing that in our daily lives to our (or AT&T’s) benefit. Anyway, I must correct the facts. Indigenous use of Echinacea specifically for respiratory ailments was certainly much less widespread than the use of, say, yarrow or Lomatium; however, E. pallida was reportedly used for colds, fever, and several viral diseases, E. purpurea for cough, and E. angustifolia for sore throat, cough, tonsillitis, and mumps (among other things). This is from the authoritative reference (Daniel Moerman, Native American Ethnobotany). Attributing Native traditional knowledge to “the heads of” white people would be perceived by many Native Americans as cultural piracy. The credentialled Europeans you reference undoubtedly did not select echinacea at random to treat viral infections.

  18. apteryx says:

    Whoops, that would be Dr. Sampson, of course.

  19. daedalus2u says:

    One of the largest difficulties with the idea that low-frequency EMF causes problems is that natural exposure to fluctuating magnetic fields is very high. The Earth’s magnetic field is about 50 microTesla. If you spin around in one spot, you have just subjected your entire body to a 100 microTesla change, from +50 to -50 and back again. That is ~250x the 0.4 microTesla cutoff considered important in studies of power-line exposure. It is very hard to imagine a “damage” mechanism caused by exposure at levels two orders of magnitude below natural ones.
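    The arithmetic here can be sanity-checked in a few lines (a sketch using only the figures quoted in this comment; the variable names are mine):

    ```python
    # Figures quoted above: Earth's field ~50 microtesla; spinning in place
    # swings the body-axis component from +50 to -50; 0.4 microtesla is the
    # exposure cutoff used in the power-line epidemiology literature.
    earth_field_uT = 50.0
    swing_uT = 2 * earth_field_uT       # +50 to -50: a 100 microtesla change
    powerline_cutoff_uT = 0.4

    ratio = swing_uT / powerline_cutoff_uT
    print(ratio)  # ~250, i.e. the natural swing is about 250x the cutoff
    ```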

    There is another class of possible mechanisms: if “natural” magnetic field changes are used to regulate physiology, perhaps to modulate biorhythms, to adjust physiology to the seasons, or as part of why many women have their cycle entrained with the moon’s cycle, then disrupting that signaling might have adverse effects on health. This would be through a regulatory effect, not through injury. It would likely only show up in whole organisms and might be quite idiosyncratic to humans. It might also not have any kind of dose-response proportionality: the “signal” is the timing, not the magnitude. Because of the very large fluctuations that occur with organism movement, the only time a reliable signal could be obtained would be when the organism is stationary, for example during sleep. Disruption of normal biological rhythms could conceivably have a broad range of nondescript and idiosyncratic symptoms. If exposure during sleep is important, then what matters could be things like electric blankets and analog alarm clocks, not cell phones. Both of these can produce very high 60 Hz magnetic fields.

    Ferromagnetic particles are quite ubiquitous in the environment. Little bits of magnetite are quite stable and can be found everywhere. Iron bearing micrometeorites form ferromagnetic particles when they enter the Earth’s atmosphere. When these ferromagnetic particles are subjected to a time varying magnetic field, they spin, just like a magnetic stir bar. A spinning particle inside a cell would be extremely disruptive of that cell’s normal physiology. Cells in vitro likely have greater exposure to particles like this than cells in vivo.

  20. Harriet Hall says:

    Do many women have their cycle entrained with the moon’s cycle? I seriously doubt that; I suspect superstition, magical thinking, and confirmation bias.

  21. Wallace Sampson says:

    Apteryx:

    The relevance of history is not that there was no historical natural use of microwave radiation, but is found in what follows – that the only reference for an association at the time seemed to be that talk show phone call from a man whose wife had a brain tumor and who wanted to sue Motorola or one of the producing companies. My recollection of detail is a bit cloudy on this and I did not keep the news articles, but that’s the best I can do. There was no other reason for making a claim of association of cell phone radiation with brain tumor.
    There was one report of six brain tumors over a 6–24 month period in San Jose, Calif., which was provisionally blamed not on cell phones (not yet popular at the time) but on overhead power lines and appliances. This was determined to be a natural cluster.
    Regarding echinacea use by Native Americans, my references – 16 texts on the subject – and my primary source, Vogel’s Native American Medicine make no mention of respiratory infections or a group of symptoms resembling same. They list some nineteen tribes and twenty odd specific conditions such as snakebite, headache, toothache, wounds (local application) and inhaling the smoke from burning echinacea. One use was for sore throat, but as a local anesthetic. I found only one reference to colds/flus in a web page on herbs from Univ. of North Carolina, but I received no confirmation on requesting a primary source. So I am indebted to you for your reference, which I would like somehow to verify.
    In a brief review of this a few years ago, I made the point that all reports of usage must be open to doubt because of language and interpretation – from one of many dialects to French, Spanish, and/or English – the natural variations in verbal transmission, and the fact that conditions were called by different names in the 16th–19th centuries than they are known by now; some disorders no longer exist (quinsy), or are known to be multiple different problems (dropsy).
    The fact remains that echinacea was eventually marketed specifically for colds and flus without similar documented North American use.
    I hope we agree that the cell phone issue is a case where basic science trumps clinical epidemiology.

  22. Joe says:

    Apteryx,

    Whether or not Echinacea was historically used to treat the “cold” is irrelevant in light of the fact that it has not been proven to work for that problem.

    However, one major database does not list that use:
    http://www.ars-grin.gov/cgi-bin/duke/ethnobot.pl
    Whereas another does:
    http://herb.umd.umich.edu/

    Note that the citations in the second case are recent (the last thirty years) which brings up another problem: Are these people reporting historical use, or has the information been contaminated with modern notions? That is another reason one needs to look at the original sources.

  23. daedalus2u says:

    Not all women have their cycles entrained by the lunar cycle, but a significant number do.

    http://www.ncbi.nlm.nih.gov/pubmed/3716780

    The mechanism by which that happens remains unknown. Because steroids are involved, and steroids are both synthesized and degraded by cytochrome P450 enzymes which co-generate and are regulated by superoxide and other radical-like species with unpaired electrons, it is not inconceivable that there is some sort of magnetic field modulation of something.

    There are not many other potential mechanisms. Light may have worked before humans had artificial light, but not now. Gravity would seem unlikely because the change is slow and very small. Perhaps women have their cycles entrained with the lunar cycle due to a placebo effect; that is, it entrains because they expect it to be entrained. There is a tendency for women living together to have their cycles synchronize.

  24. daedalus2u says:

    There is a recent review of lunar effects.

    http://www.ncbi.nlm.nih.gov/pubmed/16407788

    The details of how the experiments are done may have large and unrecognized significance, for example if animals are housed in a cage made of steel wire, that may significantly attenuate any magnetic field fluctuations.

  25. Fifi says:

    “There is a tendency for women living together to have their cycles synchronize.”

    Yes, but this has nothing to do with the moon or woo. I’ve never seen any direct evidence that women’s menstrual cycles actually sync up with the moon by anything other than chance (and temporarily). But certainly the women I know who find this kind of thing mystical, a sign of their interconnectedness and personal harmony with nature, note it and consider it significant when it happens. I’ll check the study you posted, but I suspect the same confirmation bias Harriet does, since this is what I’ve observed generally.

  26. apteryx says:

    Dr. Sampson writes:
    “I hope we agree that the cell phone issue is a case where basic science trumps clinical epidemiology.”

    Yes, since the epidemiological evidence so far seems to be pretty weak. I do not agree with some people around here that any level of epidemiological or clinical trial evidence should be rejected if the “basic science” were not already there. If cell phone users had five times the rate of brain cancer, in multiple studied populations, I would consider it cause for alarm even if there had been no plausible explanation offered for how that could happen.

    You are quite right to denounce the practice of “epidemiology by victim,” whereby causality is attributed without evidence to a statistically meaningless handful of cases. I hope I can cite that the next time an alarm is raised about some common botanical because four people who have taken it had liver diseases. ;-)

  27. Harriet Hall says:

    “There is a tendency for women living together to have their cycles synchronize.”

    I’m not sure I believe this any more than synchronization with the moon. Yes, I know there are studies, but I’m skeptical. I can think of too many things that could have led to misleading results. The phenomenon is particularly prone to confirmation bias. Everyone “knows” more patients come to the ER when there is a full moon, but that turns out not to be true.

    From the review Daedalus posted: “a number of reports find no correlation between the lunar cycle and human reproduction and admittance to clinics and emergency units”

    I’m keeping an open mind until I see more convincing data. Meanwhile, I’m not going to waste my time speculating about possible mechanisms for a phenomenon that may not even exist.

  28. daedalus2u says:

    “Meanwhile, I’m not going to waste my time speculating about possible mechanisms for a phenomenon that may not even exist.”

    Don’t worry, there are plenty of us to take up the slack ;)

  29. Will TS says:

    Daedalus,

    It’s an interesting idea that changes in earth-strength magnetic fields are responsible for physiological rhythms that are correlated to lunar cycles. Does that mean that the strength of the earth’s magnetic field varies with the phase of the moon? Perhaps NASA has some publicly available data to demonstrate this. Can you provide a link? How is the lunar magnetic effect transmitted to the earth, since the moon has no magnetic field of its own? Is the lunar variation in the earth’s magnetic field greater in magnitude than geographic variations in field strength? Would it be negated by movement through the magnetic field that people might experience by walking? Is the field strength variability large enough to have systematic effects on soluble radicals in physiological systems at 37C? Is the signal strength large enough to be detectable in thermal noise within cells? Does the geographic alignment of compounds with unpaired electrons affect their chemical properties? How would your proposed magnetic effect be transduced into a biochemical process? Are we compromising the stability of physiological systems by having magnets on our refrigerators? I would love to see some references that address these issues.

  30. daedalus2u says:

    I haven’t done much (if any) actual research into this, just some out of the box thinking (aka idle speculation). I suspect that it is not a “magnitude” effect, but rather a characteristic frequency effect. The moon does pass between the Earth and the Sun and must have a substantial effect on the solar wind striking the Earth during the new moon. The solar wind does have strong effects on the Earth’s magnetic field. Because the magnetic axis aligns more with the rotational axis (which is tilted) and not the axis of the moon’s orbit, the angle with which the solar wind strikes the Earth’s magnetic field depends on the season, and so the influence of the moon on that also changes with the season.

    Originally I started thinking of this in the context of astrology. The “conventional explanations” of effects of season of birth on personality are obviously non-physiologic and cannot be correct; there are no known physiological processes that could conceivably couple to gravity at those levels. However, human infants are born defenseless and are solely dependent on adults for a long time. The only way an infant could increase its survival during that period would be by modulating the care those adults give it, and the only way that can happen is via social interaction between the infant and the adults. If a different social-interaction “style” (aka personality) optimized infant survival even a tiny amount depending on the season of birth, over evolutionary time there would be very strong selection. There were many different causes of infant mortality in prehistoric times; each one would have a different seasonal dependence, and each one could be “optimized for” via different personality traits. For example, being clingy when it is cold and rainy might protect against hypothermia and predators, but might foster the transmission of skin diseases. Over evolutionary time each woman had on average 2 children survive and reproduce. If a trait that “optimized” personality depending on season of birth increased that survival to 2.002 (1 extra child in 1,000), the trait becomes essentially universal in a few tens of thousands of generations: 1 − 1/1.001^20,000 ≈ 0.999999998.
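    That closing expression is easy to verify directly (a sketch; it is the comment’s own rough compounding argument, not a population-genetics model, and the variable names are mine):

    ```python
    # A trait raising expected surviving offspring from 2.000 to 2.002
    # amounts to a relative fitness advantage of about 1.001 per generation.
    fitness_advantage = 1.001
    generations = 20_000

    # The comment's expression: 1 - 1/1.001**20000, a rough proxy for how
    # completely the advantaged lineage dominates after that many generations.
    frequency_proxy = 1 - 1 / fitness_advantage ** generations
    print(round(frequency_proxy, 9))  # ~0.999999998
    ```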

    Obviously personality at birth has to be mediated by brain structures present at birth. For the fetus to get a “signal” to indicate what neuroanatomy needs to be grown, that “signal” needs to reach the fetus in utero. Magnetic fields and magnetic field fluctuations could do that. There is probably a “default” personality (which may be modified by other aspects of the in utero environment such as genes, maternal nutrition, age, parity, stress, disease, etc) which is then further modified by this hypothetical seasonal effect. The “seasonal” effect may be different in different geographical regions. Humans from one region may have adapted to one trait for one season and in another region to the opposite trait. There are probably so many possible variations even of identical genotypes that sorting it out (assuming any effects are even real) may not be possible.

    http://www.ncbi.nlm.nih.gov/pubmed/16100517

    This paper does discuss some of the physics behind unpaired-electron mediated magnetic effects. Essentially, when a donor molecule with an electron pair has an electron removed, you end up with two unpaired electrons. Those electrons have spin, which does couple to a magnetic field, and the magnitude of the field modulates the flipping of the spins. If one of the spins does flip, then recombination is inhibited. There are many thousands of pathways that transfer electrons from a donor to a receptor. Changing the relative timing of the forward and back reactions could have many largely unknown effects. All of the steroids are synthesized and metabolized by the cytochrome P450 enzymes, which are regulated by NO and also generate superoxide. They would be candidates for this type of behavior. The largest class of transcription factors is the zinc finger proteins, and metalation of zinc finger proteins is mediated through NO.

    Here is an example where the lack of a magnetic field adversely affects development in an amphibian.

    http://www.ncbi.nlm.nih.gov/pubmed/1930306

    They discuss abnormal development of the spine. Bone growth and mineralization is regulated in part by nitric oxide which does have an unpaired electron. Many of the growth factors and transcription factors active in neurodevelopment have effects mediated through NO. A characteristic frequency of magnetic field might entrain neuron activation through stochastic resonance (if the neuron was tuned to do so). Low folate does cause neural tube defects, as does both too much NO (as from NO donors) and too little NO (as from inhibition of nitric oxide synthase). Folate does “rescue” neural tube defects caused by NO donors. NO does regulate neuroproliferation in multiple ways in the developing nervous system.

    I think that any magnetic field effects would be considerably smaller than the neurodevelopment effects I am looking at, mediated through basal NO via stress pathways (but different, because they could be frequency dependent). Multiple organs are known to be programmed in utero (heart, liver, vasculature, endocrine system), and some of that programming is known to be mediated through NO. It would be beyond surprising if the most important organ, the brain, were not programmed in utero. All of these effects are non-linear and coupled, and even a handful of coupled non-linear systems is completely intractable to model and predict; they exhibit chaotic behavior. Just as we cannot predict the weather in detail, we should not expect to be able to predict the details of neurodevelopment, which is many orders of magnitude more complicated.

  31. Harriet Hall says:

    Daedalus refers to the “conventional explanations” of effects of season of birth on personality.

    There’s no point in trying to explain the mechanism for a phenomenon that doesn’t exist.

  32. Will TS says:

    Daedalus,

    Thanks for the interesting references. I still have questions. The Johnsen and Lohmann paper concludes that the mechanism through which migratory animals detect the earth’s magnetic field remains undescribed. Apparently, ferromagnetic materials had been found in neurons in the trigeminal nerve of pigeons, but no electrophysiological response has been demonstrated in response to changing field alignment. Conversely, a neural response to field alignment has been observed in the marine mollusc Tritonia but no receptor organ has been identified. You seem to have some more recent information. Can you elucidate?

    As for the effects of the moon on the magnetic field, what is the ‘characteristic frequency’ of the earth’s magnetic field? I always thought it was very nearly static. I guess I was also mistaken about the source of the magnetic field; I naively assumed that it was created by the dynamo effect of the metallic core of the earth. Thanks for explaining that it’s caused by solar wind. How does the moon affect the field during the months when its orbit does not carry it directly between the earth and sun, which I believe is most months? We must be able to predict how characteristic frequencies vary with moon phase, solar intensity, season, etc. Right?

    If gravity is too weak to influence biochemical processes, how significantly does the much weaker magnetic field do so? Is it all based on magnetically induced changes in nitric oxide activity? Are all oxides magnetic? Dihydrogen monoxide? Haven’t chemists and chemical physicists examined electron transfer in magnetic fields? Can chemical engineers catalyze reactions with weak magnetic fields? Does that principle apply to physiological systems where soluble compounds are randomly oriented? Don’t magnetic materials, like ferromagnetic metals, have to be held in a crystalline structure to maintain a magnetic polarity? Are zinc finger proteins crystalline? They have zinc.

    Has the 1991 study on amphibian development been replicated? It must be standard laboratory practice by now, especially since they have developed a reliable method to generate newts with two heads. Are developmental abnormalities more prevalent in natural environments with weaker magnetic fields? I was under the impression that the ambient magnetic field at the equator was about 30 microTesla, whereas the field strength at the poles was about 60 microTesla. There must be a higher incidence of birth defects at low latitudes. Is that true? And how does that fit with your hypothesis that cytochrome P450 activity is controlled by cosmic rays?

  33. Harriet Hall says:

    Any hypotheses about the Earth’s magnetism must take into account that it is constantly changing.

    When I was looking into feng shui’s recommendation to point the head of your bed towards the north, I discovered that:

    (1) The poles will reverse again, at which time feng shui will have to reverse its recommendations.

    (2) Feng shui doesn’t tell us whether to use magnetic north or geographic north. The magnetic north pole isn’t at the geographic North Pole; it’s currently in Canada and is gradually working its way towards Siberia. See map at http://gsc.nrcan.gc.ca/geomag/nmp/long_mvt_nmp_e.php
    In some locations, using a compass to point your bed north would actually point it east, west, or south.

    (3) The magnetic north pole moves around an elliptical course across an 85 km distance every day. If you used a sensitive compass to point your bed “north” you would have to put your bed on a moving platform. Diagram at:
    http://gsc.nrcan.gc.ca/geomag/nmp/daily_mvt_nmp_e.php

    Sometimes exploring the logical consequences of a hypothesis makes it look less appetizing.
