Gluten-free diet has no effect on autism, according to a new study

Parents of autistic children are constantly seeking new treatments. Autism spectrum disorder, or ASD, is a developmental disorder that causes problems in social interaction and communication, ranging from mild to severe. It's an extremely challenging condition for parents, and as of today there is no cure.

However, there are plenty of websites that offer treatments for autism, many of them unproven. One of the more common claims is that autistic children will benefit from a gluten-free, casein-free diet. There has been some weakly supportive evidence for this idea, such as this 2012 report from Penn State, but that study was based entirely on interviews with parents. Interviews are notoriously unreliable for scientific data collection.

Perhaps because so few effective treatments are available, many parents of autistic children have tried gluten-free diets in the hope that they might work. (One can find entire books dedicated to this diet.)

The science behind the idea that gluten or casein causes (or worsens) autism has always been sketchy. The push for diet-based treatments has its origins in the anti-vaccine movement, beginning with the fraudulent 1998 study (eventually retracted) in The Lancet led by Andrew Wakefield, a former gastroenterologist who lost his medical license after his fraud was discovered. Wakefield claimed that autism was caused by a "leaky gut," which somehow allowed vaccine particles to make their way to the brain, which in turn caused autism. That chain of events was never supported by scientific evidence. Nonetheless, it morphed into the hypothesis that gluten (or casein) somehow leaks out of the intestines and causes some symptoms of autism. There's no evidence to support that either.

(Despite losing his medical license, Wakefield has become a leading voice in the anti-vaccine movement, making speeches and even movies to try to convince parents not to vaccinate their kids. Many journalists and scientists have written about him and the harm that he's done, but that's not my topic today.)

Another hypothesis, according to WebMD, is that autistic children have some kind of allergic reaction to gluten. There is no good evidence for this either.

Surprisingly, virtually no good studies have asked the question: do gluten-containing foods actually cause the symptoms of autism? Now we have a carefully done study that provides an answer.

The new study, just published in the Journal of Autism and Developmental Disorders, is the first randomized, well-controlled study of gluten-free diets in children with autism. The scientists, all from the University of Warsaw, Poland, recruited 66 children between 3 and 5 years old and randomly assigned half of them to a gluten-free diet for 6 months. The other half ate a normal diet that included at least one gluten-containing meal a day. After 6 months, the scientists evaluated all of the children using multiple standardized measurements of autistic behavior.

The results were very clear: the study found no difference between the diets. None of the core symptoms of ASD were different between children in the two groups, and there were no differences in gastrointestinal symptoms either. As the study itself stated:
"There is no evidence either against or in favor of gluten avoidance for managing symptoms of ASD in children." 
This study should put to rest all of the claims that a gluten-free diet can somehow improve the symptoms of autism. It doesn't provide an easy answer for parents, and the medical community still needs to do much more work to find better treatments. But let's hope that parents get the message: don't feed your autistic child a restricted diet.

What's the proper amount of dog for optimal health?

Humans have had dogs as companions for thousands of years. Over that time, dogs have evolved to become ever-better companions, as we humans selectively bred them for traits that we like, such as friendliness and loyalty.

Dog owners already know that owning a dog reduces stress. But it turns out that the health benefits of owning a dog go quite a bit further: two new studies published this month in the journal Circulation both found that owning a dog reduces your risk of dying.

The first study, by Carolyn Kramer and colleagues at the University of Toronto, reviewed ten other studies dating back more than 50 years, covering 3.8 million people. They compared dog owners to non-owners and found that dog owners had a 24% lower risk of dying, from any cause, over a 10-year period. The benefit was even greater for people who'd suffered a heart attack: those who had a dog at home after their heart attack had a 65% lower risk of dying.

The second study, by Tove Fall and colleagues at Uppsala University, focused on the benefits of owning a dog for people who have had a heart attack or a stroke. They used the Swedish National Patient Register to identify 335,000 patients who'd suffered one of these events between 2000 and 2012, about 5% of whom were dog owners. They found even greater benefits than the first study: among people who'd had a heart attack, the risk of dying was 33% lower for dog owners than for people who lived alone. The benefits were smaller but still significant for people who lived with a companion (a spouse or a child): they still had a 15% lower risk of dying if they also owned a dog. For those who'd had a stroke, the risk of dying for dog owners was 27% lower than for people who lived alone, and 12% lower than for people who lived with a companion but didn't have a dog. This study measured the risk over a 4-5 year followup period.
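For readers wondering what a figure like "24% lower risk of dying" actually means, here is a minimal worked example in Python. The counts are made up purely for illustration; they are not the data from either study.

```python
# Illustration only: hypothetical counts, not the actual study data.
owner_deaths, owner_total = 76, 1000            # hypothetical dog owners
nonowner_deaths, nonowner_total = 100, 1000     # hypothetical non-owners

risk_owners = owner_deaths / owner_total             # 0.076
risk_nonowners = nonowner_deaths / nonowner_total    # 0.100

relative_risk = risk_owners / risk_nonowners         # 0.76
print(f"{1 - relative_risk:.0%} lower risk of dying")   # prints "24% lower risk of dying"
```

The published studies report hazard ratios from survival models rather than simple proportions, but the interpretation of "X% lower risk" is the same.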

These studies are consistent with many other scientific reports stretching back decades, and they all point in the same direction: dog ownership is good for your health. In fact, back in 2013 the American Heart Association issued an official statement on "Pet Ownership and Cardiovascular Risk" with this recommendation:
"Pet ownership, particularly dog ownership, may be reasonable for reduction in cardiovascular disease risk."
However, because the evidence was not very strong, the AHA also advised that people shouldn't get a pet "for the primary purpose of reducing CVD risk." In other words, don't get a dog if you don't want one. As every dog owner knows, owning a dog is much more trouble than simply taking a daily pill.

The new studies strengthen the existing evidence for the health benefits of owning a dog. In an accompanying editorial in Circulation, Dhruv Kazi from Harvard Medical School asks a critical question: is the association between dog ownership and reduced mortality just a correlation, or is it causal? He points out that studies have shown that dog ownership reduces blood pressure and other signs of stress, and that dog owners tend to get outside and walk more (with their dogs). Thus it's very plausible, medically speaking, that dog ownership is good for you. For these and other reasons, Kazi concludes that
"the association between dog ownership and improved survival is real, and is likely at least partially causal."
One final question is still nagging at me, though. Now that we know that dog ownership is good for your health, what's the optimal dose? Would it be even healthier to own two dogs rather than one? And what if we throw in a cat, does that strengthen or reduce the effect? Finally, is it healthier to own a larger dog, or is a small one just as good?

Clearly, more research is needed.

[Note: the author discloses that he owns a rescue dog, a rather small terrier.]

Can You Improve Your Memory With A Jellyfish Protein?

Some colleagues of mine recently asked me about Prevagen, a supplement that is being advertised heavily on television as a memory booster. It's everywhere, they said–but what is it? And does it work?

Both questions are pretty easy to answer. On the first question, the TL;DR version is that Prevagen's primary ingredient is a protein called apoaequorin, which is found in a species of jellyfish that glows in the dark. These jellies produce two proteins, apoaequorin and green fluorescent protein (GFP), that together produce their glow. It's an amazing biological system, and the three scientists who discovered and developed the chemistry of GFP were awarded the 2008 Nobel Prize in Chemistry.

Cool science! But what does this have to do with human memory? Not much, it turns out.

First let's examine what Prevagen's manufacturer, Quincy Bioscience, says about it. The company's website claims that:
"Prevagen Improves Memory* 
Prevagen is a dietary supplement that has been clinically shown to help with mild memory loss associated with aging.* Prevagen is formulated with apoaequorin, which is safe and uniquely supports brain function.*"
Sounds pretty clear, right? But note the asterisks by each of these claims: if you scroll all the way down (or read the small print on their packages), you'll find out that:
"*These statements have not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure or prevent any disease."
You may recognize this language: it's what all supplement manufacturers use to avoid getting in trouble with the FDA. It means, essentially, that the government hasn't approved Prevagen to treat anything, including memory loss.

Despite Quincy's claims, I see no reason why eating this protein would have any effect at all on brain function. First of all, it's not even a human protein, so it's unlikely to work in humans. Second, even if it did work in humans, eating it would not deliver it to our brains, because it would almost certainly be broken down in the stomach. And third, the connection between any protein and memory is very complex, so simply having more of a protein is very, very unlikely to improve memory.

Quincy's website points to a single study that they themselves conducted, which they argue showed benefits for people with mild memory impairment. However, others have pointed out that the experiment (which was never published in a scientific journal) didn't show any such thing: overall there was no difference between people who took Prevagen and those who took a placebo, but the manufacturer did some p-hacking to extract a subgroup that appeared to get a benefit. As Dr. Harriet Hall and others have pointed out, this kind of p-hacking is bogus.
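To see why that kind of after-the-fact subgroup hunting is so unreliable, here's a small simulation, purely illustrative and not a reconstruction of Quincy's actual analysis. It assumes a made-up trial in which the pills do nothing at all, then tests 20 arbitrary subgroups anyway.

```python
# Illustrative simulation: the "drug" has zero real effect, yet chance alone
# will often produce at least one "significant" subgroup if you test enough of them.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 100  # hypothetical participants per arm

placebo = rng.normal(50, 10, n)   # memory test scores, no real difference
drug = rng.normal(50, 10, n)

print("overall p-value:", round(stats.ttest_ind(drug, placebo).pvalue, 3))

significant = 0
for _ in range(20):                               # 20 arbitrary post-hoc subgroups
    idx = rng.choice(n, size=30, replace=False)
    if stats.ttest_ind(drug[idx], placebo[idx]).pvalue < 0.05:
        significant += 1
print(f"subgroups 'significant' by chance alone: {significant} of 20")
```

With enough slicing, some subgroup will eventually look good, which is exactly why pre-registered, whole-group comparisons are the scientific standard.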

And what about my observation that the jellyfish protein will simply be digested in the stomach, and never make it to the brain? It turns out that the company itself admits that I'm right. On their website, they have a "research" page pointing to several safety studies, designed to show that Prevagen won't cause an immune reaction. One of these studies explains that
"Apoaequorin is easily digested by pepsin."
Pepsin is the chief digestive enzyme in your stomach. So Prevagen's main ingredient never gets beyond the stomach, which is why it's probably quite safe. (Joe Schwarcz at McGill University recently made the same point.)

Back in 2015, I asked Ted Dawson, the Abramson Professor of Neurodegenerative Diseases at Johns Hopkins School of Medicine, what he thought of Prevagen’s claims.
“It is hard to evaluate Prevagen as to the best of my knowledge there is no peer-reviewed publication on its use in memory and cognition,” said Dawson. “The study cited on the company’s web site is a small short study, raising concerns about the validity of the claims.”
Finally, a word to those who are still tempted to try Prevagen: it isn't cheap. Their website charges $75 for a bottle of 60 pills, each containing 10 mg of apoaequorin, or $90 for 30 pills of the "Professional formula," each containing 40 mg. (Note that there's no evidence that taking a higher dose will work any better.)
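Doing the arithmetic on those list prices (a quick, purely illustrative calculation):

```python
# Cost per pill and per milligram of apoaequorin, from the prices quoted above.
for name, price, pills, mg in [("regular", 75.0, 60, 10),
                               ("professional", 90.0, 30, 40)]:
    print(f"{name}: ${price / pills:.2f} per pill, ${price / (pills * mg):.3f} per mg")
# regular: $1.25 per pill, $0.125 per mg
# professional: $3.00 per pill, $0.075 per mg
```

Either way, that's a lot of money for a protein that your stomach will promptly digest.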

The FTC sued Quincy Bioscience in 2017 for deceptive advertising, arguing that claims that Prevagen boosts memory are false, and that claims it can get into the brain are also false. Just a few months ago, a judge ruled that the case can proceed. Meanwhile, though, the advertising and sales of Prevagen continue. The FTC case states that Quincy sold $165 million worth of Prevagen in the U.S. from 2007 to 2015.

So the bottom line is: jellyfish proteins are very cool, but eating them won't improve your memory. If you're interested in brain food, perhaps you should just eat more fish, which might actually work.

(Note: I wrote about Prevagen in 2015, and some elements of this article are based on my earlier one.)

How to live longer: eat less red meat and more nuts. And be optimistic.

Will eating more red meat kill you? It just might–but there's something you can do about it.

Red meat has long been implicated in some very bad health outcomes, especially heart disease and colon cancer. And yet people keep eating it, for the simple reason that it tastes good (though not to everyone, of course).

A recent study out of Harvard and Fudan University, published in the BMJ, put some numbers on the risk of eating red meat. The authors used two very large studies, one of men and one of women, to ask a simple question: does eating red meat make it more likely that a person will die?

The answer is yes: in the study, men and women who ate more red meat–half a serving more per day–had about a 9% greater risk of death (from any cause) over the course of an 8-year followup period. Processed meats were worse: an increase of half a serving per day was associated with a 17% higher risk of death.

In case you're wondering, "processed" meats are foods like hot dogs, bacon, and sausages. And if half a serving per day sounds like a lot, it's not: the scientists defined a serving as just 85 grams of beef, pork, or lamb, or a single 45-gram hot dog, or 2 slices of bacon. By comparison, a quarter-pound hamburger is about 113 grams. So an extra half-serving isn't very much.
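To put those serving sizes in concrete terms, here's a quick back-of-the-envelope calculation using the definitions above (the burger weight is approximate):

```python
# Rough arithmetic with the serving definitions quoted above.
RED_MEAT_SERVING_G = 85        # one serving of beef, pork, or lamb
quarter_pounder_g = 113        # roughly a quarter-pound patty

servings = quarter_pounder_g / RED_MEAT_SERVING_G
print(f"one quarter-pounder is about {servings:.2f} servings of red meat")

# A quarter-pounder every other day already exceeds the half-serving-per-day
# increase that the study linked to higher mortality.
print(f"every other day adds about {servings / 2:.2f} servings per day")
```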

(But smoked salmon lovers needn't worry: as I wrote back in August, smoked salmon is not processed meat. It's fish, which is far healthier than red meat.)

Can you lower your risk of death by reducing red meat consumption? The study looked at this question too, and the answer was, again, yes: if you replace one serving of red meat per day with whole grains, vegetables, or nuts, your risk of dying goes down, by as much as 19%.

Even better is to replace one serving per day of processed meat (bacon, sausages, etc) with nuts:
"A decrease in processed meat and a simultaneous increase in whole grains, vegetables, or other protein sources was even more strongly associated with lower total mortality, with the largest reductions in risk seen with increases in nuts."
That swap was associated with a 26% reduction in the risk of death over eight years. The authors found similar results when they looked at benefits over different time spans.

The conclusion is pretty clear: replace some of the red meat in your diet with vegetables, whole grains, or nuts, and you'll probably live longer.

There's another thing you can do to avoid dying of a heart attack: be optimistic. In a completely independent study published just last week in JAMA Network Open, scientists conducted a meta-analysis of 15 earlier studies, asking whether optimism is associated with better heart health. They found that over a 14-year period, optimistic people had a 35% lower (relative) risk of cardiovascular problems and a 14% lower (relative) risk of dying than pessimistic people.

There are many caveats about this study. First, it's a meta-analysis, meaning that it combines the data from many other studies; that can lead to biases, but the authors acknowledged this problem and seem to have been careful to avoid it. Second, how do you measure optimism? It turns out there's a questionnaire for that, dating back 25 years, and it appears to be reliable and reproducible. Most of the studies used the same method for measuring optimism, and the benefits were quite consistent across all of them. Third, it's possible that sicker people are more pessimistic, so the cause-and-effect could run in either direction.
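For readers curious about what "combining the data from many other studies" involves, here is a bare-bones sketch of the standard fixed-effect, inverse-variance approach. The numbers are invented for illustration; they are not the studies in the optimism paper.

```python
# Minimal fixed-effect meta-analysis: pool each study's relative risk,
# giving more weight to studies with smaller standard errors.
import math

# (relative risk, standard error of the log relative risk) -- made-up values
studies = [(0.70, 0.15), (0.62, 0.20), (0.80, 0.10), (0.55, 0.25)]

weights = [1 / se ** 2 for _, se in studies]
pooled_log_rr = sum(w * math.log(rr) for (rr, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

low, high = (math.exp(pooled_log_rr + z * pooled_se) for z in (-1.96, 1.96))
print(f"pooled RR = {math.exp(pooled_log_rr):.2f} (95% CI {low:.2f} to {high:.2f})")
```

Real meta-analyses layer random-effects models, heterogeneity statistics, and publication-bias checks on top of this simple weighted average, which is where the biases the authors had to guard against come in.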

So there you have it: cut down on red meat, eat nuts instead, and stay positive. You'll live longer.

Is this drug combo a true fountain of youth?

Is rejuvenation of the thymus a key to restoring youth? Maybe it is.

A very surprising result appeared last week in a journal called Aging Cell. A team of scientists published the first results of a study that showed, in a small group of older men, that some signs of aging could be reversed with a 3-drug combination.

Not just slowed down. Reversed.

If this holds up, it could literally be life-changing for millions of people. I was initially very skeptical, having read countless claims of anti-aging treatments over the years, virtually all of which turned out to be wrong. Anti-aging treatments are a huge commercial market, full of misleading promises and vague claims. Youth-restoring skin treatments (which don't work) are a particular favorite of cosmetics companies.

But this new study is different. The scientists decided to explore whether recombinant human growth hormone (rhGH) could help to restore the thymus gland. Your thymus is in the middle of your chest, and it is part of your immune system, helping to produce the T cells that fight off infections. As we age, the thymus shrinks and starts to get "clogged with fat," as a news story in Nature put it. Hints that rhGH could help restore the thymus go back decades, but the idea had never before been tested in humans.

The scientists leading the study added two more drugs, DHEA and metformin, because rhGH does carry some increased risk of diabetes. Both of these drugs help to prevent diabetes, and both might also have anti-aging benefits, although neither of them is known to affect the thymus.

Amazingly, in 7 out of 9 men in the study (it was a very small study), the thymus showed clear signs of aging reversal, with new thymus tissue replacing fat. The side effects of rhGH are very mild, and none of the men in this study had any significant problems from it or from the other two drugs.

Equally remarkable was another, unanticipated, sign of anti-aging. The study measured "epigenetic age" in all the subjects by four different methods. "Epigenetic age" refers to markers at the cellular level that change as we age, and as the study explains:
"Although epigenetic age does not measure all features of aging and is not synonymous with aging itself, it is the most accurate measure of biological age and age‐related disease risk available today."
After 9 months of treatment, the epigenetic age of the men in this study was 2.5 years younger. The treatment didn't just slow aging–it reversed it. The effects persisted in a followup 6 months later: one and a half years after the study began, the men's epigenetic age was 1.5 years younger than at the beginning. This is truly remarkable.

Any study has limitations, so I should mention a couple here. First, the study was very small, just 9 men, but the effects were strong and significant. Second, the lead scientist of the study, Gregory Fahy, is the co-founder of a company called Intervene Immune that plans to market anti-aging treatments. The authors also include scientists from Stanford, UCLA, and the University of British Columbia.

A few years ago I wrote about another drug combination, dasatinib and quercetin, which showed great promise in reversing aging, but only in mice. We're still waiting to hear more about that treatment, although a test in humans showed some promise earlier this year.

The new 3-drug combination is the most promising I've seen yet. The possible benefits are enormous: as the study points out, they include lower risks for at least 8 types of cancer, heart disease, and stroke. Unlike many other anti-aging treatments, this one has genuine plausibility, and the effects on the thymus can be measured almost immediately. Let's hope this one works out; we'll all be better off if it does.

College football season is starting up. Why do universities support a sport that harms their student athletes?

For those of us in academia, September means a new school year, and all of the excitement and energy that students bring as they return to campus. Strolling around, you can feel it in the air.

September is also the beginning of the college football season (in the U.S.). For many students, alumni, and other fans, watching the game each week is one more fall activity they look forward to.

But now, thanks to a rapidly growing body of new research, we know that football can severely harm and even kill its players. Not right away, but years later, through a brain disease called CTE, or chronic traumatic encephalopathy. This is a frightening disorder that gradually destroys brain cells, causing memory loss, confusion, impaired judgment, and progressive dementia. Many former players die very young, in their 40s or 50s, after first suffering for years.

CTE is caused by repeated blows to the head, which are common in football. The problem has grown worse in recent decades as players have gotten bigger and stronger. Improvements in helmet technology haven't helped, and they may even have made CTE worse, because the helmets allowed players (by their own admission) to use their heads as battering rams.

Two years ago now, a large medical study of football players' brains showed that an appallingly high percentage of those players had CTE. In that study, Boston University scientists led by Jesse Mez and Ann McKee found CTE in the brains of 110 out of 111 former NFL players (99%), and 48 out of 53 college players (91%).

As the BU scientists themselves pointed out, the former players and their families may have suspected something was wrong, and that may have motivated them to participate in the study. Thus the extremely high proportion of deceased players showing CTE in this study is almost certainly an overestimate. But as I wrote at the time:
"is it okay to ask young men to play football if the risk of permanent brain damage is only 50%? What if it's just 10 or 20%? Is that okay? Is football that important?"
Clearly, the answer should be no. University presidents are constantly, even obsessively, worrying about the safety of their students. Campuses have many programs in place to protect students from crime, from sexual harassment, from emotional distress, and more. And yet every fall, they willingly–no, enthusiastically–subject the 100 or so students on their football teams to a serious risk of lifelong, life-threatening brain damage. This simply should end.

For an especially poignant story, watch this short video about Greg Ploetz, who played on the 1969 national championship football team at the University of Texas, and who died in 2015 after years of worsening dementia.
As his daughter says in the video,
"If [today's football players] knew what he went through, and what we went through as a family, there's no way that people would decide to keep playing." 
Perhaps universities could take a cue from former Baltimore Ravens player John Urschel, widely considered the smartest player in the NFL, who was pursuing a Ph.D. in mathematics at MIT while simultaneously playing pro football. Two years ago, Urschel quit, because he was worried that brain damage would destroy his ability to think clearly. And just one week ago, Indianapolis Colts' star quarterback Andrew Luck retired early because football had "wrecked his body and stolen his joy."

Brain damage may be happening to much younger players too. A study from the University of Washington last year found that 5% of youth football players aged 5-14 had experienced concussions each season. Three years ago, a mother sued the Pop Warner youth football organization after her son committed suicide at age 25. An autopsy showed that he had CTE, and the mother argued that his brain damage was caused by his years playing youth football. The Pop Warner organization settled the suit for close to $2 million, but other lawsuits have been filed since.

As I and others have written, football, with its promise of big-money television contracts, has corrupted our universities. While universities build ever-bigger football stadiums and pay coaches exorbitant salaries, they force the players to play for free. Now we know that players face a much more direct threat: long-term brain damage.

Let me ask university presidents this question as bluntly as I can: how much brain damage is acceptable for your football players? If your answer is "none," then it's time to get football out of our universities.

$545 million wasn't enough for chiropractors. Now they're lobbying Congress for much more.

Medicare currently wastes more than $545 million a year on chiropractors, as I revealed in an article last year. Wasteful as this is, it's not enough for chiropractors, who have successfully lobbied to have two Congressmen propose a new bill, HR3654, that would require Medicare to pay chiropractors for the full range of services that real doctors offer.

The American Chiropractic Association is practically rubbing its (metaphorical) hands together with glee. As they proudly point out, this endorsement of quackery is bipartisan: the bill is sponsored by two New York Congressmen, Democrat Brian Higgins and Republican Tom Reed.

The idea of having chiropractors function as regular physicians is very troubling. Chiropractors do not receive proper medical training: they get their Doctor of Chiropractic (D.C.) degrees from one of a very small number of special chiropractic schools, which do not provide the full medical training that real medical schools do. Their curriculum also includes a heavy dose of pseudoscience, especially the training around subluxations.

For a detailed discussion of why chiropractors are not competent to be family physicians, I recommend this article by an experienced physician, Dr. Harriet Hall, titled "Chiropractors as family doctors? No way!" Dr. Hall goes into considerable detail explaining why many of the medical practices of chiropractors are non-standard, not evidence-based, and possibly harmful. Or see this lengthy takedown of chiropractic subluxations by Sam Homola, a former chiropractor.

Many chiropractors are also anti-vaccine, unfortunately, as documented just two weeks ago in this article by attorney Jann Bellamy. Among other things, Bellamy points out that a major chiropractic conference this fall will feature a keynote talk by anti-vaccine activist Robert Kennedy Jr. (about whom I've written before).

Even more alarming, as I've explained before, is that chiropractic neck manipulation has been shown to carry a small but real risk of stroke, because it can create a tear in your vertebral arteries. For example, this report from 2016 documented a case of cerebral hemorrhage apparently caused by chiropractic manipulation. The patient in that case was a 75-year-old woman, which puts her squarely in the class of patients eligible for Medicare.

And to those chiropractors who've read this far: I'm sorry that you were hoodwinked into spending 3-4 years in a chiropractic school, paying nearly $200,000 in tuition and fees, with the promise that you'd be a legitimate medical professional. You were scammed, and I'm sorry about that. And I understand that most (perhaps all) chiropractors want to help their patients. The problem is that the training offered by chiropractic colleges falls far short of a proper medical degree.

If the chiropractors' lobbying association gets its way, this $545 million (annually) in wasted Medicare dollars will soon become a far larger amount–to the detriment of patients. The bill would allow chiropractors to bill Medicare for pretty much any service that a bona fide physician offers.

It's also worth noting that in 2018, Medicare's Inspector General issued a report titled "Medicare needs better controls to prevent waste, fraud, and abuse related to chiropractic services," which revealed that almost half of Medicare spending on chiropractic care from 2010-2015, between $257 million and $304 million per year, was likely wasted or fraudulent. One wouldn't think this is a time to expand Medicare's coverage of chiropractic.

Congress, don't be fooled by arguments that this proposed new law will lower medical costs, or give patients what they need: it won't. Instead, it will dramatically increase the amount of funds wasted on ineffective treatments. The U.S. does need a better health care system, but this bill would be a big step in the wrong direction.