$3.7 million to study quack medicine at a leading cancer center

Sometimes I'm not sure whether the best response to pseudoscience is to ignore it, or to patiently try to explain why it's wrong, or to get mad.

This week I'm mad.

My anger and frustration were triggered by a tweet from Memorial Sloan-Kettering's Integrative Medicine account.
For those who don't know, Memorial Sloan-Kettering Cancer Center is one of the world's leading cancer centers, both for treatment and research. If you are diagnosed with cancer, MSK is one of the best places to go.

But not everything at MSK is world class. Unfortunately, they have an "integrative medicine" center that offers a mixture of therapies ranging from helpful to benign to useless. One of their biggest activities is acupuncture, which they claim offers a wide range of benefits to cancer patients.

The MSK tweet was boasting about a new $3.7 million study, funded by the NIH, of the effect of acupuncture on the pain that cancer patients experience from chemotherapy and bone-marrow transplants.

Here's why I'm mad: cancer patients are extremely vulnerable, often suffering the most frightening and difficult experience of their lives. They are completely dependent on medical experts to help them. When a place like MSK suggests a treatment, patients take it very seriously–as they should. But they really have no choice: a cancer patient cannot easily look for a second opinion, or switch hospitals or doctors. Even if they have the money (and cancer treatment is extremely expensive), switching hospitals might involve a long interruption with no treatment, during which they could die, and it might also involve traveling far from their home.

Offering these patients ineffective treatments based on pseudoscience–and make no mistake, that's what acupuncture is–is immoral. Now, I strongly suspect that MSK's "integrative medicine" doctors sincerely believe that acupuncture works. Their director, Jun Mao, is clearly a true believer, as explained in this profile of him on the MSK website. But that doesn't make it okay.

I've written about acupuncture many times before (here, here, here, and here, for example), but let me explain afresh why it is nonsense.

Acupuncture is based on a pre-scientific notion, invented long before humans understood physiology, chemistry, neurology, or even basic physics: that a mysterious life force, called "qi," flows through the body along energy lines called meridians. As explained in this article by MSK's Jun Mao:
"According to traditional Chinese medicine ... interruption or obstruction of qi was believed to make one vulnerable to illness. The insertion of needles at specific meridian acupoints was thought to regulate the flow of qi, thus producing therapeutic benefit."
Today we know that none of this exists. There is no qi, and there are no meridians. In that same article, Jun Mao continued by admitting that
"the ideas of qi and meridians are inconsistent with the modern understanding of human anatomy and physiology."
And yet this is what they offer to patients at MSK.

Just to be certain, I read one of the latest studies from MSK, published early this year, which claims to show that acupuncture relieves nausea, drowsiness, and lack of appetite in multiple myeloma patients who were going through stem cell transplants.

It's a mess: totally unconvincing, and a textbook case of p-hacking (or data dredging). The paper describes a very small study, with just 60 patients total, in which they measured literally dozens of possible outcomes: overall symptom score at 3 different time points, a different score at 3 time points, each of 13 symptoms individually, and more. I counted 24 different p-values, most of them not even close to significant, but they fixated on the 3 that reached statistical significance. The two groups of 30 patients weren't properly balanced: the sham acupuncture group started out with more severe symptoms according to their own scoring metric, and Figure 2 in the paper makes it pretty clear that there was no genuine difference in the effects of real versus sham acupuncture.
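To see why measuring that many outcomes is a red flag, consider a quick simulation–a minimal sketch in Python, assuming 24 independent outcomes and the usual 0.05 threshold, not a re-analysis of the MSK data. Even when nothing works at all, "significant" p-values show up by luck:

```python
import numpy as np

rng = np.random.default_rng(42)

n_trials = 100_000   # simulated "studies"
n_outcomes = 24      # outcomes measured per study, all with NO real effect
alpha = 0.05         # conventional significance threshold

# Under the null hypothesis, p-values are uniform on [0, 1], so each
# outcome has a 5% chance of looking "significant" by luck alone.
p_values = rng.uniform(size=(n_trials, n_outcomes))
n_significant = (p_values < alpha).sum(axis=1)

print(f"P(at least 1 'significant' outcome) = {(n_significant >= 1).mean():.2f}")
print(f"P(at least 3 'significant' outcomes) = {(n_significant >= 3).mean():.3f}")
# Analytically: 1 - 0.95**24 ≈ 0.71 for at least one false positive.
```

Under these assumptions, about 71% of null studies yield at least one "significant" outcome, and three or more turn up by chance roughly 12% of the time. Measure enough things and something will cross the line; that's the essence of p-hacking.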

But they got it published (in a mediocre journal), so now they point to it as "proof" that acupuncture works for cancer patients. This study, bad as it is, appears to be the basis of the $3.7 million NIH grant that they're now going to use, they say, in "a larger study in 300 patients to confirm these previous findings."

And there you go: the goal of the new study, according to the scientists themselves, is not to see if the treatment works, but to confirm their pre-existing belief that acupuncture works. Or, as one scientist remarked on Twitter, "they already have a result in mind, the whole wording of this suggests that they EXPECT a positive outcome. How did this get funded exactly?"

Good question.

So I'm mad. I'm mad that NIH is spending millions of dollars on yet another study of a quack treatment (acupuncture) that should have been abandoned decades ago, but that persists because people make money off it. (And, as others have explained in detail, acupuncture is actually a huge scam that former Chinese dictator Mao Zedong foisted on his own people, because he couldn't afford to offer them real medicine. For a good exposé of Chairman Mao's scam, see this 2013 Slate piece.)

But I'm even more upset that doctors at one of the world's leading cancer centers are telling desperately ill patients, who trust them with their lives, that sticking needles into their bodies at bogus "acupuncture points" will relieve the pain and nausea of chemotherapy, or help them with other symptoms of cancer. I'm willing to bet that most MSK doctors don't believe any of this, but they don't want to invest the time or energy to try to stop it.

(I am somewhat reassured by the fact that MSK's main Twitter account has nearly 75,000 followers, while its integrative medicine Twitter account has just 110.)

Or perhaps they are "shruggies": doctors who don't believe in nonsense, but figure it's probably harmless so they don't really object. To them I suggest this: read Dr. Val Jones's account of how she too was a shruggie, until she realized that pseudoscience causes real harm.

And finally, let me point to this study in JAMA Oncology from last year, by doctors from Yale, which looked at the use of so-called complementary therapies among cancer patients. They found that
"Patients who received complementary medicine were more likely to refuse other conventional cancer treatment, and had a higher risk of death than no complementary medicine."
And also see this 2017 study from the Journal of the National Cancer Institute, which found that patients who used alternative medicine were 2.5 times more likely to die than patients who stuck to modern medical treatments.

That's right, Memorial Sloan-Kettering: patients who use non-traditional therapies are twice as likely to die. That's why I'm mad. This is not okay.

Gluten-free diet has no effect on autism, according to a new study

Parents of autistic children are constantly seeking new treatments. Autism spectrum disorder, or ASD, is a developmental disorder that causes problems in social interaction and communication, ranging from mild to severe. It's an extremely challenging condition for parents, and as of today there is no cure.

However, there are plenty of websites that offer treatments for autism, many of them unproven. One of the more common claims is that autistic children will benefit from a gluten-free, casein-free diet. There has been some weakly supportive evidence for this idea, such as this 2012 report from Penn State, but that study was based entirely on interviews with parents. Interviews are notoriously unreliable for scientific data collection.

Perhaps because so few effective treatments are available, many parents of autistic children have tried gluten-free diets, in the hope that perhaps they might work. (One can find entire books dedicated to this diet.)

The science behind the idea that gluten or casein causes (or worsens) autism has always been sketchy. The push for diet-based treatments has its origins in the anti-vaccine movement, beginning with the fraudulent 1998 study (eventually retracted) in The Lancet led by Andrew Wakefield, a former gastroenterologist who lost his medical license after his fraud was discovered. Wakefield claimed that autism was caused by a "leaky gut," which somehow allowed vaccine particles to make their way to the brain, which in turn caused autism. That chain of events was never supported by scientific evidence. Nonetheless, it morphed into the hypothesis that gluten (or casein) somehow leaks out of the intestines and causes some symptoms of autism. There's no evidence to support that either.

(Despite losing his medical license, Wakefield has become a leading voice in the anti-vaccine movement, making speeches and even movies to try to convince parents not to vaccinate their kids. Many journalists and scientists have written about him and the harm that he's done, but that's not my topic today.)

Another hypothesis, according to WebMD, is that autistic children have some kind of allergic reaction to gluten. There is no good evidence for this either.

Surprisingly, virtually no good studies have asked whether gluten-containing foods actually cause the symptoms of autism. Now we have a carefully done study that provides an answer.

The new study, just published in the Journal of Autism and Developmental Disorders, is the first randomized, well-controlled study of gluten-free diets in children with autism. The scientists, all from the University of Warsaw, Poland, recruited 66 children, and assigned half of them at random to a gluten-free diet. The other half were given a normal diet, with at least one meal a day containing gluten, for 6 months. The children ranged from 3 to 5 years old. After 6 months, the scientists evaluated all children using multiple standardized measurements of autistic behavior.

The results were very clear: the study found no difference between the diets. None of the core symptoms of ASD were different between children in the two groups, and there were no differences in gastrointestinal symptoms either. As the study itself stated:
"There is no evidence either against or in favor of gluten avoidance for managing symptoms of ASD in children." 
This study should put to rest the claims that a gluten-free diet can somehow improve the symptoms of autism. It doesn't provide an easy answer for parents, and the medical community still needs to do much more work to find better treatments. But let's hope that parents get the message: don't put your autistic child on a restricted diet.

What's the proper amount of dog for optimal health?

Humans have had dogs as companions for thousands of years. Over that time, dogs have evolved to become ever-better companions, as we humans selectively bred them for traits that we like, such as friendliness and loyalty.

Dog owners already know that owning a dog reduces stress. But it turns out that the health benefits of owning a dog go quite a bit further: two new studies published this month in the journal Circulation both found that owning a dog reduces your risk of dying.

The first study, by Carolyn Kramer and colleagues at the University of Toronto, reviewed ten other studies dating back more than 50 years, covering 3.8 million people. They compared dog owners to non-owners and found that dog owners had a 24% lower risk of dying, from any cause, over a 10-year period. The benefit was even greater for people who'd suffered a heart attack: those who had a dog at home after their heart attack had a 65% lower risk of dying.

The second study, by Tove Fall and colleagues at Uppsala University, focused on the benefits of owning a dog for people who have had a heart attack or a stroke. They used the Swedish National Patient Register to identify 335,000 patients who'd suffered one of these events between 2000 and 2012, about 5% of whom were dog owners. They found even greater benefits than the first study: among people who'd had a heart attack, the risk of dying was 33% lower for dog owners than for people who lived alone. The benefits were smaller but still significant for people who lived with a companion (a spouse or a child): they had a 15% lower risk of dying if they also owned a dog. For those who'd had a stroke, the risk of dying for dog owners was 27% lower than for people who lived alone, and 12% lower than for people who lived with a companion but didn't have a dog. This study measured risk over a 4-5 year followup period.

These studies are consistent with many other scientific reports, stretching back decades, and they all point in the same direction: dog ownership is good for your health. In fact, back in 2013 the American Heart Association issued an official statement on "Pet Ownership and Cardiovascular Risk" with this recommendation:
"Pet ownership, particularly dog ownership, may be reasonable for reduction in cardiovascular disease risk."
However, because the evidence was not very strong, the AHA also advised that people shouldn't get a pet "for the primary purpose of reducing CVD risk." In other words, don't get a dog if you don't want one. As every dog owner knows, owning a dog is much more trouble than simply taking a daily pill.

The new studies strengthen the existing evidence for the health benefits of owning a dog. In an accompanying editorial in Circulation, Dhruv Kazi from Harvard Medical School asks a critical question: is the association between dog ownership and reduced mortality just a correlation, or is it causal? He points out that studies have shown that dog ownership reduces blood pressure and other signs of stress, and that dog owners tend to get outside and walk more (with their dogs). Thus it's very plausible, medically speaking, that dog ownership is good for you. For these and other reasons, Kazi concludes that
"the association between dog ownership and improved survival is real, and is likely at least partially causal."
One final question is still nagging at me, though. Now that we know that dog ownership is good for your health, what's the optimal dose? Would it be even healthier to own two dogs rather than one? And what if we throw in a cat, does that strengthen or reduce the effect? Finally, is it healthier to own a larger dog, or is a small one just as good?

Clearly, more research is needed.

[Note: the author discloses that he owns a rescue dog, a rather small terrier.]

Can You Improve Your Memory With A Jellyfish Protein?

Some colleagues of mine recently asked me about Prevagen, a supplement that is being advertised heavily on television as a memory booster. It's everywhere, they said–but what is it? And does it work?

Both questions are pretty easy to answer. On the first question, the TL;DR version is that Prevagen's primary ingredient is a protein called apoaequorin, which is found in a species of jellyfish that glows in the dark. These jellies produce two proteins, apoaequorin and green fluorescent protein (GFP), that help them fluoresce. It’s an amazing biological system, and the three scientists who discovered and developed the chemistry of GFP were awarded the 2008 Nobel Prize in chemistry.

Cool science! But what does this have to do with human memory? Not much, it turns out.

First let's examine what Prevagen's manufacturer, Quincy Bioscience, says about it. The company's website claims that:
"Prevagen Improves Memory* 
Prevagen is a dietary supplement that has been clinically shown to help with mild memory loss associated with aging.* Prevagen is formulated with apoaequorin, which is safe and uniquely supports brain function.*"
Sounds pretty clear, right? But note the asterisks by each of these claims: if you scroll all the way down (or read the small print on their packages), you'll find out that:
"*These statements have not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure or prevent any disease."
You may recognize this language: it's what all supplement manufacturers use to avoid getting in trouble with the FDA. It means, essentially, that the government hasn't approved Prevagen to treat anything, including memory loss.

Despite Quincy's claims, I see no reason why eating this protein would have any effect at all on brain function. First of all, it's not even a human protein, so it's unlikely to work in humans. Second, even if it did work in humans, eating it would not deliver it to our brains, because it would almost certainly be broken down in the stomach. And third, the connection between any protein and memory is very complex, so simply having more of a protein is very, very unlikely to improve memory.

Quincy's website points to a single study that they themselves conducted, which they argue showed benefits for people with mild memory impairment. However, others have pointed out that the experiment (which was never published in a scientific journal) didn't show any such thing: overall there was no difference between people who took Prevagen and those who took a placebo, but the manufacturer did some p-hacking to extract a subgroup that appeared to get a benefit. As Dr. Harriet Hall and others have pointed out, this kind of p-hacking is bogus.
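Here's a hedged illustration of that subgroup trick, using simulated data rather than Quincy's actual numbers: give two groups identical outcomes, carve them into enough arbitrary subgroups, and a "significant" slice usually appears anyway.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hits = 0
n_sims = 2_000

for _ in range(n_sims):
    # Fake trial: both arms drawn from the SAME distribution,
    # so the "supplement" truly does nothing by construction.
    treated = rng.normal(50, 10, size=100)
    placebo = rng.normal(50, 10, size=100)
    # Dredge: carve each arm into 10 arbitrary subgroups of 10
    # and keep whichever comparison looks best.
    best_p = min(
        stats.ttest_ind(treated[i*10:(i+1)*10],
                        placebo[i*10:(i+1)*10]).pvalue
        for i in range(10)
    )
    hits += best_p < 0.05

print(f"'Significant' subgroup found in {hits / n_sims:.0%} of null trials")
```

With ten subgroups tested at the 0.05 level, a no-effect trial hands you at least one "significant" subgroup about 40% of the time (1 - 0.95^10 ≈ 0.40). That's why subgroups chosen after the fact don't count as evidence.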

And what about my observation that the jellyfish protein will simply be digested in the stomach, and never make it to the brain? It turns out that the company itself admits that I'm right. On their website, they have a "research" page pointing to several safety studies, designed to show that Prevagen won't cause an immune reaction. One of these studies explains that
"Apoaequorin is easily digested by pepsin."
Pepsin is the chief digestive enzyme in your stomach. So Prevagen's main ingredient never gets beyond the stomach, which is why it's probably quite safe. (Joe Schwarcz at McGill University recently made the same point.)

Back in 2015, I asked Ted Dawson, the Abramson Professor of Neurodegenerative Diseases at Johns Hopkins School of Medicine, what he thought of Prevagen’s claims.
“It is hard to evaluate Prevagen as to the best of my knowledge there is no peer-reviewed publication on its use in memory and cognition,” said Dawson. “The study cited on the company’s web site is a small short study, raising concerns about the validity of the claims.”
Finally, a word to those who are still tempted to try Prevagen: it isn't cheap. Their website charges $75 for a bottle of 60 pills, each containing 10 mg of apoaequorin, or $90 for 30 pills of the "Professional formula," which contain 40 mg. (Note that there's no evidence that taking a higher dose will work any better.)

The FTC sued Quincy Bioscience in 2017 for deceptive advertising, arguing that claims that Prevagen boosts memory are false, and that claims it can get into the brain are also false. Just a few months ago, a judge ruled that the case can proceed. Meanwhile, though, the advertising and sales of Prevagen continue. The FTC case states that Quincy sold $165 million worth of Prevagen in the U.S. from 2007 to 2015.

So the bottom line is: jellyfish proteins are very cool, but eating them won't improve your memory. If you're interested in brain food, perhaps you should just eat more fish, which might actually work.

(Note: I wrote about Prevagen in 2015, and some elements of this article are based on my earlier one.)

How to live longer: eat less red meat and more nuts. And be optimistic.

Will eating more red meat kill you? It just might–but there's something you can do about it.

Red meat has long been implicated in some very bad health outcomes, especially heart disease and colon cancer. And yet people keep eating it, for the simple reason that it tastes good (though not to everyone, of course).

A recent study out of Harvard and Fudan University, published in the BMJ, put some numbers on the risk of eating red meat. The authors used two very large studies, one of men and one of women, to ask a simple question: does eating red meat make it more likely that a person will die?

The answer is yes: in the study, men and women who ate more red meat–half a serving more per day–had about a 9% greater risk of death (from any cause), over the course of an 8-year followup period. Processed meats were worse: an increase of half a serving per day led to a 17% higher risk of death.

In case you're wondering, "processed" meats are foods like hot dogs, bacon, and sausages. And if half a serving per day sounds like a lot, it's not: the scientists defined a serving as just 85 grams of beef, pork, or lamb, or a single 45-gram hot dog, or 2 slices of bacon. By comparison, a quarter-pound hamburger is 115 grams. So an extra half-serving isn't very much.
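To translate those relative risks into absolute terms, here's a small worked example. The 5% baseline is an assumed round number for illustration only–the real baseline depends on age, sex, and health, and isn't taken from the BMJ paper:

```python
# Translating the paper's relative risks into absolute terms.
# ASSUMPTION for illustration: a 5% baseline 8-year risk of death.
baseline_risk = 0.05

relative_risks = {
    "unprocessed red meat": 1.09,  # +9% per extra half-serving/day
    "processed meat": 1.17,        # +17% per extra half-serving/day
}

for food, rr in relative_risks.items():
    new_risk = baseline_risk * rr
    print(f"+0.5 serving/day of {food}: "
          f"{baseline_risk:.1%} -> {new_risk:.2%} 8-year risk of death")
```

So for someone with that assumed baseline, the extra half-serving moves an 8-year risk of death from 5% to about 5.5% (red meat) or 5.9% (processed meat)–a modest change for any one person, but a substantial one across a whole population.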

(But smoked salmon lovers needn't worry: as I wrote back in August, smoked salmon is not processed meat. It's fish, which is far healthier than red meat.)

Can you lower your risk of death by reducing red meat consumption? The study looked at this question too, and the answer was, again, yes: if you replace one serving of red meat per day with whole grains, vegetables, or nuts, your risk of dying goes down by as much as 19%.

Even better is to replace one serving per day of processed meat (bacon, sausages, etc) with nuts:
"A decrease in processed meat and a simultaneous increase in whole grains, vegetables, or other protein sources was even more strongly associated with lower total mortality, with the largest reductions in risk seen with increases in nuts."
That led to a 26% reduction in the risk of death over eight years. The authors found similar results when they looked at benefits over different time spans.

The conclusion is pretty clear: replace some of the red meat in your diet with vegetables, whole grains, or nuts, and you'll probably live longer.

There's another thing you can do to avoid dying of a heart attack: be optimistic. In a completely independent study published just last week in JAMA, scientists conducted a meta-analysis of 15 earlier studies, asking whether optimism is associated with better heart health. They found that over a 14-year period, optimistic people had a 35% lower (relative) risk of cardiovascular problems and 14% lower (relative) risk of dying than pessimistic people.

There are many caveats about this study. First, it's a meta-analysis, meaning that it combines the data from many other studies; that can introduce biases, but the authors acknowledged this problem and seem to have been pretty careful to avoid it. Second, how do you measure optimism? It turns out there's a questionnaire for that, dating back 25 years, and it appears to be reliable and reproducible; most of the studies used the same method, and the benefits were quite consistent across all of them. Third, it's possible that sicker people are more pessimistic, so the cause-and-effect could run in either direction here.

So there you have it: cut down on red meat, eat nuts instead, and stay positive. You'll live longer.

Is this drug combo a true fountain of youth?

Is rejuvenation of the thymus a key to restoring youth? Maybe it is.

A very surprising result appeared last week in a journal called Aging Cell. A team of scientists published the first results of a study that showed, in a small group of older men, that some signs of aging could be reversed with a 3-drug combination.

Not just slowed down. Reversed.

If this holds up, it could literally be life-changing for millions of people. I was initially very skeptical, having read countless claims of anti-aging treatments over the years, virtually all of which turned out to be wrong. Anti-aging treatments are a huge commercial market, full of misleading promises and vague claims. Youth-restoring skin treatments (which don't work) are a particular favorite of cosmetics companies.

But this new study is different. The scientists decided to explore whether recombinant human growth hormone (rhGH) could help to restore the thymus gland. Your thymus is in the middle of your chest, and it is part of your immune system, helping to produce the T cells that fight off infections. As we age, the thymus shrinks and starts to get "clogged with fat," as a news story in Nature put it. Hints that rhGH could help restore the thymus go back decades, but the idea had never before been tested in humans.

The scientists leading the study added two more drugs, DHEA and metformin, because rhGH does carry some increased risk of diabetes. Both of these drugs help to prevent diabetes, and both might also have anti-aging benefits, although neither of them is known to affect the thymus.

Amazingly, in 7 out of 9 men in the study (it was a very small study), the thymus showed clear signs of aging reversal, with new thymus tissue replacing fat. The side effects of rhGH are very mild, and none of the men in this study had any significant problems from it or from the other two drugs.

Equally remarkable was another, unanticipated, sign of anti-aging. The study measured "epigenetic age" in all the subjects by four different methods. "Epigenetic age" refers to markers at the cellular level that change as we age, and as the study explains:
"Although epigenetic age does not measure all features of aging and is not synonymous with aging itself, it is the most accurate measure of biological age and age‐related disease risk available today."
After 9 months of treatment, the epigenetic age of the men in this study was 2.5 years younger than it was at the start. The treatment didn't just slow aging–it reversed it. The effect persisted in a followup 6 months later: one and a half years after the study began, the men's epigenetic age was still 1.5 years younger than at the beginning. This is truly remarkable.

Any study has limitations, so I should mention a couple here. First, the study was very small, just 9 men, but the effects were strong and significant. Second, the lead scientist of the study, Gregory Fahy, is the co-founder of a company called Intervene Immune that plans to market anti-aging treatments. The authors also include scientists from Stanford, UCLA, and the University of British Columbia.

A few years ago I wrote about another drug combination, dasatinib and quercetin, which showed great promise in reversing aging, but only in mice. We're still waiting to hear more about that treatment, although a test in humans showed some promise earlier this year.

The new 3-drug combination is the most promising I've seen yet. The possible benefits are enormous: as the study points out, they include lower risks for at least 8 types of cancer, heart disease, and stroke. Unlike many other anti-aging treatments, this one has genuine plausibility, and the effects on the thymus can be measured almost immediately. Let's hope this one works out; we'll all be better off if it does.

College football season is starting up. Why do universities support a sport that harms their student athletes?

For those of us in academia, September means a new school year, and all of the excitement and energy that students bring as they return to campus. Strolling around, you can feel it in the air.

September is also the beginning of the college football season (in the U.S.). For many students, alumni, and other fans, watching the game each week is one more fall activity they look forward to.

But now, thanks to a rapidly growing body of new research, we know that football can severely harm and even kill its players. Not right away, but years later, through a brain disease called CTE, or chronic traumatic encephalopathy. This is a frightening disorder that gradually destroys brain cells, causing memory loss, confusion, impaired judgment, and progressive dementia. Many former players die very young, in their 40s or 50s, after first suffering for years.

CTE is caused by repeated blows to the head, events that are common to football. It has grown worse in recent decades as the players have gotten bigger and stronger. Improvements in helmet technology haven't helped, and they might even have made CTE even worse, because the helmets allowed players (by their own admission) to use their heads as battering rams.

Two years ago, a large medical study of football players' brains showed that an appallingly high percentage of those players had CTE. In that study, Boston University scientists led by Jesse Mez and Ann McKee found CTE in the brains of 110 out of 111 former NFL players (99%), and 48 out of 53 college players (91%).

As the BU scientists themselves pointed out, the former players and their families may have suspected something was wrong, and that may have motivated them to participate in the study. Thus the extremely high proportion of deceased players showing CTE in this study is certainly an overestimate. But as I wrote at the time:
"is it okay to ask young men to play football if the risk of permanent brain damage is only 50%? What if it's just 10 or 20%? Is that okay? Is football that important?"
Clearly, the answer should be no. University presidents are constantly, even obsessively, worrying about the safety of their students. Campuses have many programs in place to protect students from crime, from sexual harassment, from emotional distress, and more. And yet every fall, they willingly–no, enthusiastically–subject the 100 or so students on their football teams to a serious risk of lifelong, life-threatening brain damage. This simply should end.

For an especially poignant story, watch this short video about Greg Ploetz, who played on the 1969 national championship football team at the University of Texas, and who died in 2015 after years of worsening dementia:
As his daughter says in the video,
"If [today's football players] knew what he went through, and what we went through as a family, there's no way that people would decide to keep playing." 
Perhaps universities could take a cue from former Baltimore Ravens player John Urschel, widely considered the smartest player in the NFL, who was pursuing a Ph.D. in mathematics at MIT while simultaneously playing pro football. Two years ago, Urschel quit, because he was worried that brain damage would destroy his ability to think clearly. And just one week ago, Indianapolis Colts' star quarterback Andrew Luck retired early because football had "wrecked his body and stolen his joy."

Brain damage may be happening to much younger players too. A study from the University of Washington last year found that 5% of youth football players aged 5-14 experienced a concussion each season. Three years ago, a mother sued the Pop Warner youth football organization after her son committed suicide at age 25. An autopsy showed that he had CTE, and the mother argued that his brain damage was caused by his years playing youth football. The Pop Warner organization settled the suit for close to $2 million, but other lawsuits have been filed since.

As I and others have written, football and its promise of big-money television contracts have corrupted our universities. While universities build ever-bigger football stadiums and pay coaches exorbitant salaries, they force the players to play for free. Now we know that players face a much more direct threat: long-term brain damage.

Let me ask university presidents this question as bluntly as I can: how much brain damage is acceptable for your football players? If your answer is "none," then it's time to get football out of our universities.