Why does anyone believe this works? The dangers of cupping.

Cupping therapy. If this looks painful and possibly damaging
to the skin, that's because it is.
People are easily fooled. Even smart people.

I'm not talking about voters in the U.S. and the UK, although both groups have recently demonstrated how easily they can be conned into voting against their own interests. You can read plenty of articles about that elsewhere.

No, I'm talking about the wide variety of health treatments that call themselves alternative medicine, integrative medicine, traditional Chinese medicine, energy medicine, and other names. These are all just marketing terms, but many people, including some physicians and scientists, seem captivated by them.

This week I'm going to look at "cupping," a rather bizarre treatment that, for reasons that escape me, seems to be growing in popularity.

I just returned from a scientific conference, where I happened to speak with an editor for a major scientific journal who also follows this blog. She remarked that she liked some of my articles, but she disagreed with me about cupping, which I wrote about during the 2016 Olympics, where swimmer Michael Phelps was observed to have the circular welts that are after-effects of cupping. This editor's argument boiled down to "it works for me," which left me somewhat flabbergasted.

And just two weeks ago, when I was at my physical therapist's office getting treatment for a shoulder injury, I heard her discussing cupping with another therapist. I then noticed a large box containing cupping equipment on one of the counters. Thankfully, my therapist didn't suggest cupping for me; I'm not sure how I would have replied.

What is cupping? It's a technique where you take glass cups, heat the air inside them, and then place them on the skin. As the trapped air cools, its pressure drops, creating suction that pulls your skin up into the glass. (Some cupping sets use pumps rather than heat to create the suction.) Imagine someone giving you a massive hickey, and then another dozen or so all over your back, or legs, or wherever the cupping therapist thinks you need it. If that sounds kind of gross, it is.
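
Just how strong is that suction? Here's a back-of-the-envelope sketch using the gas law at constant volume; the temperatures below are my own illustrative assumptions, not measurements from any cupping study:

```python
# Back-of-the-envelope estimate of cupping suction via Gay-Lussac's law
# (pressure proportional to temperature at fixed volume). The temperatures
# are illustrative assumptions, not measurements from any study.
P_ATM_KPA = 101.3    # atmospheric pressure when the cup is sealed
T_HOT_K = 353.0      # ~80 C: trapped air just after the flame is removed
T_SKIN_K = 306.0     # ~33 C: after the air cools to skin temperature

p_inside = P_ATM_KPA * (T_SKIN_K / T_HOT_K)   # pressure after cooling
suction = P_ATM_KPA - p_inside                # pressure difference on the skin
print(f"Pressure inside cup: {p_inside:.0f} kPa")
print(f"Suction: {suction:.0f} kPa (~{100 * suction / P_ATM_KPA:.0f}% of atmospheric)")
```
That works out to roughly 13 kPa of suction–something like an eighth of an atmosphere pulling on your skin.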

Quacktitioners–er, practitioners–of cupping think that it somehow corrects your "qi," a mysterious life force that simply doesn't exist. When pressed, they often remark that it "improves blood flow," a catch-all explanation that has no scientific basis and is more or less meaningless. What really happens, as the physician and blogger Orac noted, is this:
"The suction from cupping breaks capillaries, which is why not infrequently there are bruises left in the shape of the cups afterward.... If you repeatedly injure the same area of skin over time ... by placing the cups in exactly the same place over and over again, the skin there can actually die."
So maybe cupping isn't so good for you.

Cupping is ridiculous. There's no scientific or medical evidence that it provides any benefit, and it clearly carries some risk of harm. A recent review in a journal dedicated to alternative medicine–one of the friendliest possible venues for this kind of pseudoscience–concluded that
"No explicit recommendation for or against the use of cupping for athletes can be made. More studies are necessary."
Right. That's what proponents of pseudoscience always say when the evidence fails to support their bogus claims. Let us do more studies, they argue, and eventually we'll prove what we already believe. That's a recipe for bad science.

Even NCCIH, the arm of NIH dedicated to studying complementary and integrative medicine (read: nonsense), can't bring itself to endorse cupping. Their summary states:

  • There’s been some research on cupping, but most of it is of low quality.
  • Cupping may help reduce pain, but the evidence for this isn’t very strong.
  • There’s not enough high-quality research to allow conclusions to be reached about whether cupping is helpful for other conditions.

In other words, some bad scientists have conducted a few studies but haven't proven anything. But wait, it gets worse. NCCIH goes on to warn that:

  • Cupping can cause side effects such as persistent skin discoloration, scars, burns, and infections, and may worsen eczema or psoriasis. 
  • Rare cases of severe side effects have been reported, such as bleeding inside the skull (after cupping on the scalp) and anemia from blood loss (after repeated wet cupping). 

And still, otherwise intelligent people say "it works for me." I'm left speechless.

The bottom line: save your money and your skin. Don't let anyone suck it into those cups.

Measles is back. Blame the anti-vaxxers.

In the year 2000, the CDC announced that measles had been eliminated from the U.S., meaning the virus no longer circulated continuously within our borders. This was a fantastic public health achievement, made possible by the measles vaccine, which is about 97% effective after the standard two doses and which has virtually no side effects.

Unfortunately, measles is back. Just last week, the CDC announced that we've had at least 695 cases this year, the most since 2000, primarily from 3 large outbreaks, one in the state of Washington and two in New York. Because the CDC's surveillance is far from perfect, the true number of measles cases is likely much higher. And we're only four months into the year.

Also this week, UCLA and CalState-LA had to quarantine over 700 students and staff members who were exposed to measles from an outbreak in the Los Angeles area. At UCLA, one student who had measles attended multiple classes while still contagious, exposing hundreds of others to the highly contagious virus, according to a message from the university's chancellor.

No one has died as of yet, but if we don't quash these outbreaks, it's only a matter of time before someone does. Measles has a fatality rate of 0.2%, or 2 deaths per thousand cases. That may sound small, but it's truly frightening when you consider that the U.S. had an estimated 500,000 cases per year before the vaccine was introduced in 1963.
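
The arithmetic, using only the numbers above, is sobering:

```python
# A quick sanity check on the figures above: a 0.2% fatality rate
# applied to the estimated pre-vaccine U.S. caseload.
fatality_rate = 0.002          # 2 deaths per 1,000 cases
pre_vaccine_cases = 500_000    # estimated U.S. cases per year before 1963

deaths_per_year = fatality_rate * pre_vaccine_cases
print(f"Implied deaths per year: {deaths_per_year:,.0f}")   # 1,000
```
That's roughly a thousand deaths a year, from a disease we know how to prevent.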

Given the risks of measles, and given the remarkable effectiveness and safety of the vaccine, why don't people vaccinate their children? The primary reason is simple: it's the highly vocal, supremely confident, and utterly misinformed anti-vaccine movement. Anti-vaxxers spread their message daily on Facebook, Twitter, websites, and other media outlets. (I will not link to any of them here because I don't want to increase their influence.) They have launched systematic efforts throughout the U.S. and in other countries to convince parents not to vaccinate their children, claiming that vaccines cause a variety of harms, none of which are correct. (I won't list those here either, because even mentioning them gives the claims credibility.)

In one of the two outbreaks in New York, anti-vaxxers distributed highly misleading pamphlets in an effort to convince parents in an ultra-religious Jewish community not to vaccinate their kids. The anonymously-published pamphlet was "filled with wild conspiracy theories and inaccurate data," but it seems to have worked, at least among some of the parents.

The anti-vax movement is also behind the state-by-state effort to allow parents to opt out of vaccinations for their children in public schools. We're finally seeing some states roll this back, but it is still far too easy for parents to claim an "ethical" or "religious" exemption, allowing them to put their unvaccinated kids in school and thereby expose countless other children to measles and other preventable diseases. (I put those words in quotes because there is no valid ethical or religious objection to vaccines. All major religions strongly support vaccination.) Anti-vax websites provide how-to instructions telling parents how to get exemptions for their kids, and a small number of anti-vax doctors (I'm looking at you, Bob Sears) readily dispense large numbers of anti-vax exemptions. This needs to end.

The modern anti-vaccine movement started in 1998, with a fraudulent paper about the measles, mumps, and rubella vaccine, published by a former doctor who lost his medical license after the fraud came to light. The lead author, it later emerged, had taken large sums of money (unbeknownst to his co-authors) from lawyers who were trying to build a case to sue vaccine makers. That same ex-doctor, who I also won't name here (his initials are AW), is now a hero to the anti-vax movement, and he travels the world spreading his toxic message. He's even made an anti-vax movie.

I sincerely hope we won't see any children die before the anti-vaccine movement finally goes away. For any parents who are thinking that they won't vaccinate their kids, I urge them to read the heartbreaking words of Roald Dahl (author of Charlie and the Chocolate Factory, The BFG, and many other wonderful books), whose oldest daughter Olivia died of measles in 1962, at the age of seven:
"...one morning, when [Olivia] was well on the road to recovery, I was sitting on her bed showing her how to fashion little animals out of coloured pipe-cleaners, and when it came to her turn to make one herself, I noticed that her fingers and her mind were not working together and she couldn't do anything.
'Are you feeling all right?' I asked her.
'I feel all sleepy,' she said.
In an hour, she was unconscious. In twelve hours she was dead.
The measles had turned into a terrible thing called measles encephalitis and there was nothing the doctors could do to save her."
The measles vaccine was a miracle of modern medicine, and it's been administered safely to hundreds of millions of people. Measles is a dangerous illness, but we can prevent it. No parent should have to go through what Roald Dahl went through.

Climate change is making us sneeze

Allergy sufferers are having a rough time of it this spring. If you're among them, and if you think it's getting worse, you're right–and climate change is at least partly to blame.

Admittedly, warming climate has far more severe consequences, such as the eventual flooding of entire coastal cities. On a personal level, though, pollen allergies make people pretty miserable. (I write this as a lifelong sufferer myself.) When springtime comes and trees burst into buds, some of us shut all the windows and huddle inside.

I hadn't thought that climate change would affect the pollen season until I read a newly published study in a journal called The Lancet Planetary Health. (Aside: yes, there really is a journal with that name, a specialty journal created two years ago by the venerable publishers of The Lancet.)

The new study, by USDA scientist Lewis Ziska and colleagues from 15 other countries, looked at airborne pollen data from 17 locations, spanning the entire globe, and stretching back an average of 26 years. The news isn't good for allergy sufferers:
"Overall, the long-term data indicate significant increases in both pollen loads and pollen season duration over time."
In other words, it's a double whammy: we're getting more pollen than ever before, and the allergy season lasts longer. Okay, not that much longer, only an average of one day. But if you have hay fever, every day is one too many.

To be fair, not every location experienced a significant increase in pollen. Here are the 12 (of 17) that did:
  • Amiens, France
  • Brussels, Belgium
  • Geneva, Switzerland
  • Kevo, Finland
  • Krakow, Poland
  • Minneapolis, USA
  • Moscow, Russia
  • Papillion, USA
  • Reykjavik, Iceland
  • Thessaloniki, Greece
  • Turku, Finland
  • Winnipeg, Canada
Perhaps not coincidentally, the pollen season this spring is making headlines in the U.S. As the NY Times reported this week, "extreme" pollen has blanketed the middle of North Carolina. It's so bad that the air has taken on a yellowish tinge, as shown in this unaltered photo, one of several taken by photographer Jeremy Gilchrist and shared last week on social media.
A yellow haze caused by pollen over Durham, North Carolina
in April 2019. Photo credit: Jeremy Gilchrist via Facebook.

According to Ziska et al.'s study, more pollen-filled springs are the new normal. Their projections indicate that pollen seasons will continue to get longer in the future, and that the amount of pollen in the air will also increase during the spring and again in the fall, when ragweed pollen is at its peak.

What can you do about spring allergies? I wrote about this last year: for some people, over-the-counter antihistamines help, although they only treat the symptoms. Allergy shots can provide long-term relief, if you have the time to go through the months-long regimen. Other than these options, the best you can do is stay inside and wait for pollen season to end. You can always catch up on your reading of The Lancet Planetary Health.

NEJM says open access is unnecessary. Right.

Surprise: the New England Journal of Medicine thinks open access is a bad idea. Open access is the model of scientific publishing in which all results are freely available for anyone, anywhere, to read.

This week NEJM published an editorial by one of their correspondents, Charlotte Haug, that purports to present an objective look at open access publishing. It concludes that the "experiment" has failed and that free access to scientific publications hasn't delivered on its promises.

What is NEJM worried about? Their expensive, exclusive model of publishing–where everyone has to pay high subscription fees, or else pay exorbitant fees for each article they read–is threatened by scientists who want all science to be free. Pesky scientists!

NEJM is especially worried about "Plan S", a proposal in Europe to require that all scientists whose work is funded by the public be required to publish their results in open-access venues. Plan S is due to take effect very soon, in 2020 for 11 research funders in Europe.

The NEJM article is a clever but deeply flawed effort to prove that open access isn't working. It's full of fallacies and straw men, so much so that it's hard to know where to begin. Since they're not playing fair, though, I won't either: I'll cherry-pick three of Haug's arguments and explain why she's wrong about each one.

But first, to set the stage, let's remind everyone of what we're talking about. Scientific papers are written by scientists (like me), who are largely funded to do their work by governments, non-profit organizations, and occasionally by commercial companies. The writing is done by the scientists themselves, who submit papers to journals for peer review. The peer reviewing is also done by scientists (again, like me) who do this work for free. The journals pay nothing for all this work.

In other words, we do all the work for free, using funding provided by the public, and the journals then take that work and sell it for a very tidy profit. (Richard Smith estimates that NEJM itself has an income of $100 million with a 30% profit.) The vast majority of scientific and medical journals are owned by five for-profit corporations, as the NEJM points out:
"The five largest publishing houses (SAGE, Elsevier, Springer Nature, Wiley-Blackwell, and Taylor & Francis) continue to grow, with high profit margins."
For the past two decades, scientists have spoken out more and more against the outrageous practices of for-profit publishers, whose subscription fees and profits have grown even as the costs of distribution have plummeted. Virtually everyone gets scientific papers online now. Why sign over copyright when we can distribute our work so cheaply ourselves? The open access movement was founded to provide an alternative: open access journals allow everyone to read all the content for free, and the authors retain their copyrights.

Now let's look at the NEJM article. Haug starts by pretending to agree that open access is a good thing, writing:
"The idea — that the results of research should be available to be read, discuss, and examine... — has few, if any, opponents in either the scientific community or the public."
Reading this, you might think that Haug (and NEJM, by extension) are fans of open access. They are not.

Haug then proceeds (she thinks) to dismantle the arguments in favor of open access. First, she states that publishing costs have not dropped, but have increased. As evidence, she asserts that "Electronic production and maintenance of high-quality content are at least as expensive as print production and maintenance." This claim is, frankly, nonsense, but since Haug doesn't cite any evidence to back it up, there's nothing really to refute. It's obviously much cheaper to post a PDF on a website than to print thousands of hardcopies and physically ship them to libraries around the world. If costs are going up (and again, Haug cites no evidence), that could be simply because publishers are paying themselves higher salaries (NEJM reported compensation of  $703,324 for its chief editor in 2017), or hiring large staffs, or renting luxurious offices–who knows? Haug doesn't explain.

In any case, the costs of publishing at NEJM, a closed-access, subscription-based journal, have little to do with whether or not scientific and medical research should be freely available.

Her next argument against open access is that the most highly-cited journals are subscription-based, like (ahem) NEJM. My response: so what? Everyone within academia knows that it takes a very long time to establish a reputation as a "top" journal, and young scientists will always want to publish in those journals, regardless of how expensive they are. This has given closed-access journals like NEJM (and Nature, Science, JAMA, and Cell, to name a few more) tremendous power, which they have wielded to fight against open access at every opportunity. This editorial represents another example of that fight. The fact that many scientists still want to publish in these journals doesn't mean they should keep the results locked behind a paywall.

Setting aside this tiny number of "prestige" journals, open access papers do get cited more, as was demonstrated by this study from 2016. The evidence shows that open access does lead to higher impact: papers that are freely available are read more and cited more.

Finally, let's turn to Haug's coup de grace, which she wields near the end of her piece, as a sort of "proof" that open access is really unnecessary. Here she argues that NEJM is already open, mostly:
"About 98% of the research published in the Journal since 2000 is free and open to the public. Research of immediate importance to global health is made freely accessible upon publication; other research articles become freely accessible after 6 months."
First, let's acknowledge that merely by pointing this out, Haug is admitting that the main arguments for open access are legitimate; i.e., that it's a huge benefit to society to make research freely available. I'm going to agree with her here.

What Haug doesn't mention here is that there is one reason (and only one, I would argue) that NEJM makes all of its articles freely available after some time has passed: the NIH requires it. This dates back to 2009, when Congress passed a law, after intense pressure from citizens who were demanding access to the research results that they'd paid for, requiring all NIH-funded results to be deposited in a free, public repository (now called PubMed Central) within 12 months of publication.

Scientific publishers fought furiously against this policy. I know, because I was there, and I talked to many people involved in the fight at the time. The open-access advocates (mostly patient groups) wanted articles to be made freely available immediately, and they worked out a compromise where the journals could have 6 months of exclusivity. At the last minute, the NIH Director at the time, Elias Zerhouni, extended this to 12 months, for reasons that remain shrouded in secrecy, but thankfully, the public (and science) won the main battle. For NEJM to turn around now and boast that they are releasing articles after an embargo period, without mentioning this requirement, is hypocritical, to say the least. Believe me, if the NIH requirement disappeared (and publishers are still lobbying to get rid of it!), NEJM would happily go back to keeping all access restricted to subscribers.

The battle is far from over. Open access advocates still want to see research released immediately, not after a 6-month or 12-month embargo, and that's precisely what the European Plan S will do.

With Plan S looming, I've no doubt we'll see more arguments against open access in the coming months, but scientists have at least one ace up our sleeves: we're the ones who do all the work. We do the experiments, we write the papers, and we review the papers. Without us, the journals would cease to exist. The journals will have no choice but to go along with Plan S, because without the scientists, they'll have nothing to publish. Let's hope the U.S. will follow suit in the very near future. It's long past time to change the archaic, closed-access policies that have kept medical and scientific results–results that were funded by the public–locked behind the paywalls of for-profit publishers.

Salty and saltier: fast food has more sodium than ever before

High blood pressure is one of the biggest health problems in the U.S. today. The CDC estimates that 75 million American adults, about one-third of the adult population, have high blood pressure. Even more alarming is that high blood pressure "was a primary or contributing cause of death for more than 410,000 Americans in 2014," the last year for which the CDC reports data.

One of the main causes of high blood pressure (a.k.a. hypertension) is too much salt in the diet. As Americans have eaten out more and more, they've grown less aware of how much salt goes into their foods. Salt is tasty but invisible: you can't know exactly how much salt is in your food if you didn't prepare it yourself.

Everyone knows that fast foods can be salty, especially those (like French fries) that have salt sprinkled all over them. What they don't know, though, is that over the past 30 years, the amount of salt in fast foods has increased dramatically, as revealed in a new study just published by Megan McCrory and colleagues at Boston University.

The new study looked at how portion sizes, calories, sodium (salt), calcium, and iron changed in major fast food chains between 1986 and 2016. They analyzed data from these 10 restaurants:
Arby’s, Burger King, Carl’s Jr, Dairy Queen, Hardee’s, Jack in the Box, KFC, Long John Silver’s, McDonald’s, and Wendy’s
They would have looked at more chains, but the others either didn't have data available or didn't offer foods in all the categories under study.

Portions and calories all increased over the past 30 years, but I want to focus on the salt.

Back in 1986, sodium content in entrees averaged 36% of the recommended daily allowance–which is pretty high for a single entree (a burger, say). Bad as that is, though, by 2016 this had increased to 47%. Thus a single fast-food entree has nearly half of an entire day's allowance of salt. Sides increased from 14% of the RDA to 26%, which means that if you have an entree and a side (fries!), you're getting 73% of your daily salt allowance. On average, the chains are adding roughly 50% more salt today than in 1986.
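
To translate those percentages into actual milligrams, here's a quick sketch; I'm assuming the FDA's roughly 2,300 mg daily value for sodium, which may differ slightly from the baseline the study used:

```python
# The study's percentages converted to milligrams, assuming a daily value
# of ~2,300 mg of sodium (the FDA's current figure; the study's exact
# baseline may differ slightly).
DAILY_VALUE_MG = 2300

# (entree, side) averages as fractions of the daily value
averages = {1986: (0.36, 0.14), 2016: (0.47, 0.26)}
for year, (entree, side) in averages.items():
    combo = entree + side
    print(f"{year}: entree {entree * DAILY_VALUE_MG:.0f} mg, "
          f"side {side * DAILY_VALUE_MG:.0f} mg, "
          f"combo {combo:.0%} ({combo * DAILY_VALUE_MG:.0f} mg)")
```
In other words, the average entree-plus-side combo went from roughly 1,150 mg of sodium to nearly 1,700 mg over thirty years.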

And that's just the average: if you order larger sizes, or one of the saltier choices (though you may not be able to tell what those are), or more than one side dish, you can easily exceed 100% of your recommended salt intake for the day. (And by the way, for the vast majority of people, having less than the "allowance" of salt is just fine.)

I remember having fast food burgers and fries back in the 1980s, and they were quite tasty. I haven't noticed that they taste better today, and it's not clear why the chains increased the amount of salt so much. Presumably they did consumer testing and found that people like more salt, but it could also be simply that adding salt, which is a preservative, allows them to store the food supplies longer and save money.

Now, most people don't think fast food is healthy. It's popular because it tastes good and it's convenient. Nonetheless, for the large numbers of people who have high blood pressure (or pre-hypertension), the fact that salt has increased should be worrisome.

For those who want to do a little homework, you can easily find detailed nutrition facts for all the major chains online now. It took me only a few seconds to find downloadable lists for McDonald's, KFC, Wendy's, Subway, Burger King, and others, so you can compare all their items before your next visit.

For example, a McDonald's quarter pounder with cheese has 1110 mg of sodium, or 46% of your daily allowance. Their Bacon Smokehouse Artisan Grilled Chicken sandwich has far more, 1940 mg (81%), while their Filet-O-Fish, in contrast, has only 560 mg of sodium (23%). Side dishes can be surprisingly bad (or good) too. KFC's corn on the cob is a gem, with no sodium at all, but their BBQ baked beans weigh in with 820 mg of sodium.
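
If you want to run the numbers yourself from those downloadable lists, note that the percentages above imply the older 2,400 mg daily value (1110/2400 ≈ 46%). A minimal converter, using only the menu items quoted above:

```python
# A minimal milligrams-to-percent-of-daily-value converter. The menu
# figures are the ones quoted above; the 2,400 mg daily value is implied
# by the percentages in the text (1110 / 2400 = 46%).
DAILY_VALUE_MG = 2400

menu_sodium_mg = {
    "McDonald's Quarter Pounder with cheese": 1110,
    "McDonald's Bacon Smokehouse Artisan Grilled Chicken": 1940,
    "McDonald's Filet-O-Fish": 560,
    "KFC corn on the cob": 0,
    "KFC BBQ baked beans": 820,
}
for item, mg in menu_sodium_mg.items():
    print(f"{item}: {mg} mg ({mg / DAILY_VALUE_MG:.0%} of daily value)")
```
Swap in any item from the chains' lists to see how it stacks up before your next visit.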

The bottom line, though, is that if you want to eat less salt, the best way is to prepare your own food.  It's more trouble, but it's well worth the effort.

Scientists restart bioweapons research, with NIH's blessing

For more than a decade now, two scientists–one in the U.S. and one in the Netherlands–have been trying to create a deadly human pathogen from avian influenza. That's right: they are trying to turn "bird flu," which does not normally infect people, into a human flu.

Not surprisingly, many scientists are vehemently opposed to this. In mid-2014, a group of them formed the Cambridge Working Group and issued a statement warning of the dangers of this research. The statement was signed by hundreds of scientists at virtually every major U.S. and European university. (Full disclosure: I am one of the signatories.)

In response to these and other concerns, in October 2014 the U.S. government called for a "pause" in this dangerous research. NIH Director Francis Collins said that his agency would study the risks and benefits before proceeding further.

Well, four years later, the risks and benefits haven't changed, but the NIH has (quietly) just allowed the research to start again, as we learned last week in an exclusive report from Science's Jocelyn Kaiser.

I can't allow this to go unchallenged. This research is so potentially harmful, and offers such little benefit to society, that I fear that NIH is endangering the trust that Congress places in it. And don't misinterpret me: I'm a huge supporter of NIH, and I've argued before that it's one of the best investments the American public can make. But they got this one really, really wrong.

For those who might not know, the 1918 influenza pandemic, which killed between 50 and 100 million people worldwide (3% of the entire world population at the time), was caused by a strain of avian influenza that made the jump into humans. The 1918 flu was so deadly that it "killed more American soldiers and sailors during World War I than did enemy weapons."

Not surprisingly, then, when other scientists (including me) learned about the efforts to turn bird flu into a human flu, we asked: why the heck would anyone do that? The answers were and still are unsatisfactory: claims such as "we'll learn more about the pandemic potential of the flu" and "we'll be better prepared for an avian flu pandemic if one occurs." These are hand-waving arguments that may sound reasonable, but they promise only vague benefits while ignoring the dangers of this research. If the research succeeds, and one of the newly-designed, highly virulent flu strains escapes, the damage could be horrific.

One of the deadliest strains of avian flu circulating today is H5N1. This strain has occasionally jumped from birds to humans, with a mortality rate approaching 50%, far more deadly than any human flu. Fortunately, the virus has never gained the ability to be transmitted directly between humans.

That is, it didn't have this ability until two scientists, Ron Fouchier in the Netherlands and Yoshihiro Kawaoka at the University of Wisconsin, engineered it to gain this ability. (Actually, their work showed that the virus could be transmitted between ferrets, not humans, for the obvious reason that you can't ethically test this on humans.)

Well, Fouchier and Kawaoka are back at it again. NIH actually lifted the "pause" in December 2017, and invited scientists to submit proposals for this type of research. Fouchier confidently stated at the time that all he had to do was "find and replace" a few terms in his previous proposal and it would likely sail through peer review. It appears he was correct, although according to the Science article, his study has been approved but not yet actually funded. Kawaoka's project is already under way, as anyone can learn by checking the NIH grants database.

And by the way: why the heck is a U.S. funding agency supporting research in the Netherlands anyway? If Fouchier's work is so great (and it isn't), let the Netherlands fund it.

I've said it before, more than once: engineering the flu to be more virulent is a terrible idea. It appears the review process at NIH simply failed, as multiple scientists stated to Vox last week. This research has the potential to cause millions of deaths.

Fouchier, Kawaoka, and their defenders (usually other flu scientists who also benefit from the same funding) like to claim that their project to engineer a deadlier bird flu will somehow help prevent a future pandemic. This argument is, frankly, nonsense: influenza mutates while circulating among millions of birds, and no one has any idea how to predict or control that process. (I should mention that I know a little bit about the flu, having published multiple papers on it, including this paper in Nature and this paper on H5N1 avian flu.)

Fouchier and Kawaoka have also argued that we can use their work to create stockpiles of vaccines in advance. Yeah, right. We don't even stockpile vaccines for the normal seasonal flu, because it mutates too fast, so we have to produce new vaccines each year. And the notion that anyone can predict a future pandemic strain so precisely that we could design a vaccine based on their prediction is laughable.

I can't quite fathom why NIH seems to be so enraptured with the work of these two labs that, rather than simply deny them funding, it has ignored the warnings of hundreds of scientists and now risks creating a new influenza pandemic. Much as I hate to say this, maybe it's time for Congress to intervene.

Does RoundUp cause cancer?

(Quick answer: probably not. See my update at the bottom of this post.)

For many years, environmental activists have been concerned about the herbicide glyphosate, which is the main ingredient in RoundUp®, the world's most widely-used weed killer. Since 1996, global usage of glyphosate has increased 15-fold, in part due to the widespread cultivation of "RoundUp Ready" crops, which are genetically modified to be resistant to RoundUp®. This allows farmers to use the herbicide freely, killing undesirable weeds without harming their crops.

RoundUp®'s manufacturer, Monsanto, has long claimed that glyphosate is safe, and they point to hundreds of studies that support their argument.

Nonetheless, a new study raises the question again.

First let's look briefly at another recent study. A bit more than a year ago, in November 2017, a large study in the Journal of the National Cancer Institute looked at nearly 45,000 glyphosate users (farmers and other agricultural workers who apply glyphosate to crops). These "users" have a much higher exposure to RoundUp® than ordinary people. That study concluded:
"no association was apparent between glyphosate and any solid tumors or lymphoid malignancies overall, including NHL [non-Hodgkin lymphoma]."
They did find, though, that there was a trend–not quite significant–towards an increased risk for one type of leukemia, AML. This trend appeared in users who had the highest exposure to RoundUp®.

In the new study, by a group of scientists from UC Berkeley, Mount Sinai School of Medicine, and the University of Washington, the authors (L. Zhang et al.) decided to focus exclusively on people with the highest exposures to glyphosate. They point out that including people with low exposure, who might have no increased risk of cancer, tends to dilute risk estimates. Statistically speaking, this is undeniably correct, but it also means that their results may only apply to people with high exposures, and not to ordinary consumers.

The punchline from the new study: people with the highest exposure to glyphosate had a 41% higher risk of non-Hodgkin lymphoma.

One caveat to this finding is that it's a meta-analysis, meaning the authors did not collect any new data. Instead, they merged the results from six earlier studies including over 65,000 people, and they focused on those with the highest exposure levels.

Meta-analyses can be prone to cherry-picking; that is, picking the studies that tend to support your hypothesis. However, I couldn't find any sign of that here. The authors include a frank assessment of all the limitations of their study, and they also point out that multiple previous studies had similar findings, although most found smaller increases in relative risk. In the end, they conclude:
"The overall evidence from human, animal, and mechanistic studies presented here supports a compelling link between exposures to GBHs [glyphosate-based herbicides] and increased risk for NHL."
A couple more caveats are important. First, this finding is all about relative risk. Non-Hodgkin lymphoma is one of the most common cancers in the U.S. and Europe, but the lifetime risk for most people, according to the American Cancer Society, is just 1 in 42 (2.4%) for men and 1 in 54 (1.9%) for women. A 41% increase in relative risk increases those numbers to 3.4% (men) and 2.6% (women).
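
To make that concrete, here's the relative-to-absolute conversion, using only the ACS baselines just quoted:

```python
# Converting the study's 41% relative-risk increase into absolute
# lifetime risk, using the American Cancer Society baselines above.
RELATIVE_RISK = 1.41

baseline_lifetime_risk = {"men": 1 / 42, "women": 1 / 54}
for group, risk in baseline_lifetime_risk.items():
    elevated = risk * RELATIVE_RISK
    print(f"{group}: {risk:.1%} baseline -> {elevated:.1%} at the highest exposure")
```
Real, but modest, increases in absolute terms–and again, only for the most heavily exposed.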

Second, this higher risk only applies to people with very high exposure to glyphosate: primarily people who work in agriculture and apply RoundUp® to crops. Ordinary consumers (including people who eat "Roundup Ready" crops) have a far, far lower exposure, and dozens of studies have failed to show any increased risk of cancer for consumers. For most of us, then, this new study should not cause much concern, but for agricultural workers, it does raise a warning flag.

[Update 18 Feb 7:45pm] After I posted this article, the scientists at the Genetic Literacy Project pointed me to Geoffrey Kabat's piece about the Zhang et al. study. Kabat did a deep dive into the studies that Zhang et al.'s work is based on and uncovered a critical flaw in the study, one that I hadn't found. More than half of the "weight" of the meta-analysis by Zhang, and by far the largest number of cancer cases, come from a single study by Andreotti et al. published in 2018. That study reported risks for 4 different time points: 5, 10, 15, and 20 years. It turns out, as Kabat reports, that only the 20-year period showed any increase in risk of cancer. The relative risks of cancer at 5, 10, and 15 years were actually lower in the group exposed to glyphosate, and yet Zhang et al. didn't mention this fact.


Now, no one thinks that glyphosate lowers the risk of cancer, but Zhang et al. did not report that they had cherry-picked in this way. At a minimum, they should have reported what their findings would be if they used the other time periods. I suspect that they'd have found no increased risk of cancer–but this wouldn't make for such a catchy headline. This omission on their part is a serious flaw, indicating that they (and their results) might have been unscientifically biased.

The bottom line: even in those with very high exposures to glyphosate, the evidence that it causes any type of cancer is very weak. And for ordinary consumers, there's nothing to worry about.

The NY Times is far too worried about 23andMe's genetic test

The New York Times decided to publish an editorial this weekend warning people to "be careful about 23andMe's Health Test." What are they worried about?

Although the NY Times article is accurate, the warning suggests that 23andMe is somehow misleading its customers. Is it? I decided to take a look.

I'm a customer of 23andMe, and I'm also a researcher in genetics and genomics, so I know quite a bit about how their technology works and about what it can reveal. I've looked at 23andMe's latest genetic health reports, and they are remarkably clear and accurate.

Let me illustrate by revealing part of my own genetic test results. First I looked at my results for BRCA1 and BRCA2, the two genes that the NY Times article discusses.

I don't have any of the harmful mutations in the BRCA genes, and 23andMe's report says so in plain language. Notably, the report immediately provides the caveat that more than 1,000 other variants in these genes have been linked to cancer risk, and it makes abundantly clear that the test doesn't look for any of those. Here's what the NY Times article said:
"The 23andMe test can miss other possible mutations.... there are more than 1,000 other BRCA mutations that contribute to your breast cancer risk. The 23andMe test doesn’t look for any of them."

A reader of the Times might think, upon reading this, that 23andMe somehow hides this fact. But the Times article's warning is little more than a paraphrase of what 23andMe's website states.

The Times editors also caution that
"Just because you test negative for the few mutations that 23andMe screens for doesn’t mean that you won’t get breast cancer."
Duh. 23andMe explains this as well, and much more, such as:
"This test does not take into account other risk factors for breast, ovarian, prostate, and other cancers, such as personal and family health history." [from 23andMe]
23andMe also provides a wealth of information about the BRCA genes, including links to the scientific papers describing the genes and their link to cancer. I was very impressed by how thorough they are.

The Times editors focused only on the BRCA genes, but 23andMe also tests a handful of others (9, in my case). I looked at the APOE gene report, which has a mutation that has been linked to Alzheimer's disease. The bad variant is called ε4, and fortunately I don't have it.

Once again, the 23andMe site was very clear about what this means, providing a detailed table showing the risks for people with and without the mutation. In my case, they tell me that:
"Studies estimate that, on average, a man of European descent has a 3% chance of developing late-onset Alzheimer's disease by age 75 and an 11% chance by age 85."
Looking at the detailed table, one learns that if you have one copy of the APOE ε4 variant, your risk of developing Alzheimer's by age 75 is 4-7%, and if you have 2 APOE ε4 variants–the worst case–then your risk jumps to 28%. The website provides links to 10 scientific papers with far more detail, for those who want to know the basis of these numbers. This is far more than most people will want to know, and I couldn't find any flaws in 23andMe's description of the science.
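
For reference, here are those APOE numbers restated as a simple lookup table–a sketch using only the figures quoted above from 23andMe's report, which apply to men of European descent and say nothing about other risk factors:

```python
# The APOE ε4 risk figures quoted above, restated as a lookup table.
# These are 23andMe's estimates of the risk of late-onset Alzheimer's
# by age 75 for a man of European descent; they ignore all other
# risk factors.
risk_by_age_75 = {
    "population average": "3%",
    "one APOE ε4 copy": "4-7%",
    "two APOE ε4 copies": "28%",
}
for genotype, risk in risk_by_age_75.items():
    print(f"{genotype}: {risk}")
```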

The Times editorial concludes with this:
"23andMe has said that its health tests can raise awareness about various medical conditions and empower consumers to take charge of their health information. But doctors and geneticists say that the tests are still more parlor trick than medicine."
That last statement is the most egregious misrepresentation by the Times editors. Who are these geneticists who call DNA testing a "parlor trick"? The genetic tests run by 23andMe, which use technology that is run daily at thousands of labs around the world, are nothing of the sort. They are a highly accurate assay that has been repeated millions of times and validated by hundreds of peer-reviewed studies. Usually the NY Times is one of the most thorough and accurate sources in the media, but they really dropped the ball this time.

The fact is, genetics is not fate. Identical twins, who share identical DNA, rarely die of the same causes. Thus even if you knew your genetic risks perfectly, for every mutation in your DNA, you might not find anything to change in your behavior. At most, you might learn that you should get mammograms or colonoscopies slightly more often. It's legitimate to argue that you won't learn anything useful from the 23andMe tests, but you will learn something about genetics.

As far as changing your behavior to reduce health risks, you don't need a sophisticated genetic test for that. Just eat more leafy greens.

[Note: although I'm a customer of 23andMe, I have no financial relationship with the company and I'm neither an advisor to nor an investor in them.]

Relish that coffee now. It might be extinct in 20 years.

Most people think there are two major kinds of coffee, arabica and robusta. (No doubt some people think the two kinds of coffee are regular and decaf, but I digress.) And it's true that almost all the coffee that you can find in the market, or at your local coffee shop, is made from one of these beans or from a blend of both.

Actually, there are 124 species of coffee. Unfortunately, as we learned in a new paper published last week in the journal Science, 60% of them are currently in danger of going extinct. The primary threats are habitat loss (caused by humans) and climate change (also caused by humans).

Even though the term "endangered species" is more often used to refer to animals, we humans have already wiped out countless plant species, primarily through deforestation, and many more are going extinct each year. We will never know how many species have already been lost as we've chopped down rich rainforests to create grazing lands for cattle or monoculture plantations, but we do know that it's still going on.

Of the two major beans that we use for coffee, the better-tasting bean, arabica, is already endangered, according to the new study. Robusta coffees aren't bad, but as the new paper explains:
"Although robusta has some negative sensory qualities (e.g., tasting notes of wood and tobacco), it is favored in some instances for its taste, high caffeine content, and ability to add body to espresso and espresso-based coffees; it is now the species of choice for instant coffee."
If we don't do something to protect wild coffee species, we might soon be drinking nothing but robusta coffee.

If that seems implausible, recall that this already happened to the banana. In the mid-1900s, the entire worldwide production of bananas was basically wiped out by Panama disease, caused by a fungus. Before then, a tasty variety called Gros Michel was the dominant species, but thanks to the fungus:
"By 1960, the Gros Michel was essentially extinct and the banana industry nearly bankrupt. It was saved at the last minute by the Cavendish, a Chinese variety that had been considered something close to junk." (Source: NYTimes)
Now, the robusta bean is far better than "junk," but I for one prefer my arabica coffee.

The main threats to arabica (and robusta) are outbreaks of mold and fungal infections, not unlike the disease that wiped out bananas. Those 122 wild species–of which 60% are now endangered–are often resistant, allowing plant scientists to inter-breed the wild and domesticated varieties to create new strains that resist disease and taste just as good as the original. This wild "reservoir" of coffee is critical to saving coffee as we know and love it today.

You're probably accustomed to seeing coffee labelled by the region it's grown in, rather than the type of bean. This is similar to how we label wines as being from France, California, Australia, etc. Coffee is grown in many tropical regions, including Central and South America, Indonesia, central Africa, and Hawaii. Just as with wine, the climate makes a difference, but the bean itself is an even bigger factor. Consider the differences among cabernet, pinot noir, and sauvignon grapes for wine–in the same way, arabica bean coffees taste quite different from robusta coffee.

If we don't pay attention to the threat to coffee, we might all be drinking a less-tasty brew in the years to come.

(Aside: I've been working for several years on a project to sequence Coffea arabica, the tetraploid genome of arabica coffee, and our results will likely be published soon. We're hoping that the genome will assist coffee scientists who are trying to breed new, disease-resistant varieties.)

The flu vaccine is working well this year. It's not too late to get it.

Current flu trends for 2018-19. Brown shows H1N1 strains,
red shows H3N2, and yellow indicates the strain was not
genotyped.
The flu is widespread and increasing right now, according to the CDC.  At least 42 states were reporting high levels of flu activity as of the end of December 2018, and the rates are still climbing. In other words, we're in the midst of flu season.

Other than that, though, the news is relatively good. Here's why.

First, the dominant strain of flu this year is H1N1, which is the "swine flu" that first appeared as a pandemic in 2009. But pandemics don't have to come with high mortality rates, and as it turned out–luckily for humankind–the 2009 flu was milder than the previous dominant strain, H3N2, which first appeared way back in 1968.

This season, nearly 90% of the flu cases tested by the CDC are turning out to be H1N1, the milder variety. Although 10% of people are still getting the much-nastier H3N2 flu, it's good news compared to last year, when H3N2 dominated.

Back to the bad news (although this is old news): the 2009 swine flu (H1N1) didn't completely displace the older flu strain. Instead, we now have both types of influenza circulating, along with two strains of the even milder influenza B virus. Since 2009, the flu vaccine has to combat all 4 of these flu viruses, which is why you might see the term "quadrivalent" associated with the vaccine. That just means it targets all 4 different strains.
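
Spelled out, the four targets look like this. (One detail not named above: the two influenza B lineages are Victoria and Yamagata, which, as far as I know, is the standard quadrivalent lineup.)

```python
# What "quadrivalent" means in practice: the four strains the vaccine
# targets. The two influenza B lineages (Victoria and Yamagata) aren't
# named in the text above but are standard for quadrivalent formulations.
quadrivalent_targets = [
    "Influenza A (H1N1)",
    "Influenza A (H3N2)",
    "Influenza B, Victoria lineage",
    "Influenza B, Yamagata lineage",
]
for strain in quadrivalent_targets:
    print(strain)
```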

Back to the good news again: the vaccine this year contains just the right strains! This doesn't always happen; actually it happens much less frequently than anyone would like. But now that the flu season is under way, the CDC can test the circulating flu viruses and compare them to the strains that are targeted by this year's vaccine. This year, both the H1N1 and the H3N2 viruses match the vaccine strains really well, which means that if you got the shot, you are likely to be very well protected.

(Keep in mind that even in a good year, the vaccine isn't 100% effective, and you can still get the flu. But you are much less likely to get it than anyone who is unvaccinated.)

While I've got your attention, let me answer one of the top 10 health questions of the year: "how long is the flu contagious?" According to the CDC,

  • the flu is most contagious in the first 3-4 days after becoming sick.

It continues to be contagious for up to a week, so if you have the flu, stay home! And make sure those around you avoid physical contact, as much as possible, and wash their hands frequently.

And while I'm at it, let's debunk a common myth:

  • No, you can't get the flu from the vaccine.

So if you've put off getting the flu vaccine, it's not too late! The season is in full swing, but if you get the vaccine today, you'll likely have excellent protection for the rest of the season. Go get it.