A new kind of fasting provides significant immune system benefits

I've written about fasting and its effects on health before. Six years ago, a study showed that a 3-day fast can essentially reset the immune system, providing many potential benefits. These benefits include better cardiovascular health, better endurance, lower blood pressure, and reduced inflammation.

Newer data, which I'll get to in a minute, shows that you might not have to fast nearly that long to get these benefits.

In the 2014 study, Valter Longo and colleagues at USC found that fasting lowered white blood cell counts, which in turn triggered the immune system to start producing new white blood cells. White blood cells, including lymphocytes, are key components of your body's immune system. Once you start eating again, according to Longo, your stem cells kick back into high gear to replenish the cells that were recycled.

The idea behind this strategy is that you have to fast for several days to get the benefits: basically, you have to fully deplete your energy reserves (in the form of glycogen), and it takes your body at least 24 hours, and probably 48 hours or more, to do this. That's the not-so-good news. The good news is that you probably only need to fast once or twice a year to gain the benefits that Longo described.

Last week, in a paper just published in the New England Journal of Medicine, Rafael de Cabo and Mark Mattson reviewed multiple strategies for fasting that have been tested in the years since Longo's study. The news continues to be very encouraging: intermittent fasting is good for you. I don't have time or space here to discuss all the results, but I want to focus on one fasting strategy that has surprisingly good benefits.

It turns out that you can get many of the benefits of fasting without doing a 3-day fast, which for most people is really, really difficult to accomplish. Instead, you can try a much easier type of fasting, called "time-restricted" fasting. With this strategy, you fast every day, by eating all of your food in a 6-hour or 8-hour window. Or you can go with the more difficult strategy (but still easier than a 3-day fast) where you fast for 2 entire days per week. Here, then, are two intermittent fasting strategies that have similar health benefits:

  • Time-restricted: eat lunch starting at 12 noon, and finish dinner by 8:00pm. Fast until noon the next day (a 16-hour fast). Do this every day.
  • 5:2 fasting: fast for 2 different days each week, which means eating just 500-700 calories worth of food and drink on those days. Eat normally on the other 5 days.
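As a quick sanity check on the first schedule, finishing dinner at 8:00pm and not eating again until noon does work out to a 16-hour fast with an 8-hour eating window. A minimal Python sketch (the calendar dates are placeholders; only the times of day matter):

```python
from datetime import datetime

# Placeholder dates -- only the clock times matter here.
dinner_end = datetime(2020, 1, 1, 20, 0)  # finish dinner at 8:00pm
next_lunch = datetime(2020, 1, 2, 12, 0)  # next meal at noon the next day

fasting_hours = (next_lunch - dinner_end).total_seconds() / 3600
eating_window = 24 - fasting_hours

print(fasting_hours, eating_window)  # 16.0 8.0
```

Shifting the window (say, 10am to 6pm) keeps the same 16:8 ratio, which is the feature these studies tested.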

The first strategy–a daily 16-hour fast–is the easiest, but its benefits might be equal to those of 5:2 fasting and the 3-day fast. (No study has directly compared these 3 fasting regimens.)

The benefits of intermittent fasting are numerous. As de Cabo and Mattson explain, they include
"[improvements in] blood pressure; resting heart rate; levels of HDL and LDL cholesterol, triglycerides, glucose, and insulin resistance.... In addition, intermittent fasting reduces markers of systemic inflammation and oxidative stress that are associated with atherosclerosis."
Fasting also helps with weight loss, for obvious reasons. Cutting out all snacks in the evening, which is the biggest change imposed by time-restricted fasting, means not only a reduction in calories consumed, but also a reduction in the amount of highly processed ("junk") food in one's diet.

Furthermore, because intermittent fasting reduces inflammation, it may also improve symptoms of arthritis, including rheumatoid arthritis.

Why does fasting work? It's all about getting your body to switch over from glucose metabolism to ketone metabolism. Our usual 3-meal-a-day diet provides our body with a constant source of fuel in the form of glucose. Once that glucose is used up, though, our body switches to using fatty acids and ketone bodies. Ketone bodies provide more than fuel: as de Cabo and Mattson explain,
"Ketone bodies regulate the expression and activity of many proteins and molecules that are known to influence health and aging."
Ketone metabolism seems to bring a host of health benefits. The trick is getting our bodies to switch over to it, now and then. If we eat constantly, then our bodies happily subsist on glucose and never make the switch.

Does fasting truly reset your immune system? Six years ago, I concluded that a 3-day fast does the trick, at least partially. The science suggests that, if you can do it, a prolonged fast for 2-3 days will induce your body to clean out some old immune cells and switch on production of new ones. Now we're learning that intermittent fasting, which is easier to do, may work in much the same way, with multiple health benefits.

[Note: one of the authors of the NEJM study, Mark Mattson, is a Professor at Johns Hopkins School of Medicine, making him a colleague of mine. However, we are in different departments and we have never met.]

Let the 2010s be the end of the post-truth era

The 2010s are over, and the double-20s are about to begin. One phrase that I'd like to never hear again is "post-truth era," an idea that has gained prominence during the past decade, especially the last few years.

One dominant theme of post-truthery is that every person has a right to his/her own interpretation of the facts, and that these alternative interpretations–or "alternative facts," as Trump adviser Kellyanne Conway famously called them–deserve to be taken seriously. They don't.

And yet, like most of the pseudoscience that I've been writing about for years now, we have to continue calling out nonsense for what it is, because some of it is harmful, and even deadly. Politics is a never-ending font of post-truthery (or truthiness, as Stephen Colbert defined it in 2005), but here I'm concerned about science. Today I'll highlight three anti-truth campaigns that have caused great harm over the past decades, in the hope that they will soon fade away.

Among serious scientists, truth itself is not in question. Indeed, the essence of science is the search for objective truth. Sometimes, though, a scientific discovery threatens to undermine the power or profits of an influential industry, and that's when industry resorts to denialism (which is essentially the same thing as post-truthiness).

Perhaps the most well-documented example of organized science denialism dates back to the 1950s, when accumulating evidence made it clear that smoking causes cancer. The tobacco industry didn't want to admit this, even though their own internal research supported it, because it meant that their main product was killing their customers, which in turn was terrible news for their business. In response,
"the tobacco companies helped manufacture the smoking controversy by funding scientific research that was intended to obfuscate and prolong the debate about smoking and health." (Cummings et al. 2007).
Eventually, after decades of lawsuits and literally millions of smoking-related deaths, the industry was forced to admit the truth and pay out billions of dollars in settlements in the U.S. Nonetheless, many people still smoke, and even as recently as 2016, the largest tobacco company in the U.S., Philip Morris, was still trying to deny the science about cigarettes.

My second example is more recent, and potentially even more harmful to the human species. You might have already guessed it: I'm talking about climate change denialism. For several decades, the evidence has been building that the planet is getting warmer. A series of reports from the Intergovernmental Panel on Climate Change (IPCC) warned, in increasingly confident terms, that humans were the primary cause of recent warming, mostly due to our use of fossil fuels and the carbon dioxide they release into the atmosphere.

The current IPCC report states unequivocally that humans have already caused 1°C of warming, and that the warming will increase rapidly over the next several decades.

Scientifically, these facts are not in dispute. The world is already experiencing more severe storms, unprecedented floods and droughts, and die-offs due to warming oceans (such as the massive and tragic die-off of coral in the Great Barrier Reef).

However, the fossil fuel industry sees global warming as a threat to its profits. Rather than invest in new, cleaner forms of energy, large companies such as Exxon-Mobil and billionaire coal magnates such as the Koch brothers have poured countless millions of dollars into disinformation campaigns to cast doubt on the science of climate change. Prominent among these efforts is The Heartland Institute, a fossil-fuel-funded organization whose main mission is climate change denial. (Heartland has also worked to cast doubt on the link between smoking and cancer.)

Climate change denialists learned from the tobacco companies that it was possible to delay government action by decades, simply by casting doubt on the science. Unfortunately, their campaigns have been working. For years, many U.S. politicians denied that the planet was getting warmer. As that argument has become ever harder to make with a straight face, they've changed their strategy: now they might admit that the planet is heating up, but deny that human activities are responsible. They're still wrong.

The goal of this denialism is, simply put, to protect the profits of fossil fuel industries. However, truth doesn't care if you believe it or not. The world is getting hotter. Australia just had its four hottest days in recorded history. The unprecedented heat wave has led to hundreds of fires through much of the country, threatening every major city on the continent. There's no reason to think this won't keep happening.

Climate change denialists are only fooling themselves–after being duped by oil and coal companies.

My third and final example is one that has frustrated me for 15 years now, and it's one that just won't go away: the anti-vaccination movement. This is one of the most frustrating examples of post-truth wrongheadedness, in part because it seems so unnecessary, and in part because so many children have been harmed as a result.

First, let's be clear about the facts: vaccines are one of the greatest boons to human health in the history of science and medicine. People used to die by the millions from smallpox, polio, and a dozen other infections that are now almost completely preventable by vaccines.

Smallpox was 100% eliminated from the planet in 1980, in one of humankind's greatest public health triumphs. Polio has been eradicated from nearly all countries: a campaign that started in 1988 has reduced it to just a handful of countries and fewer than 100 cases worldwide. Both of these successes are due to vaccines.

Many other diseases, including measles, chicken pox, and bacterial meningitis, have been reduced so dramatically by vaccines that physicians in the U.S. and Europe almost never see a case. This graph shows the dramatic effect of the measles vaccine, which was introduced in the U.S. in the early 1960s:
Measles cases in the United States by year, 1954-2008. Source: CDC.
The graph shows the number of reported cases, but actual cases were much higher, likely around 4 million cases per year by one estimate. These are the facts.

So what happened? The modern anti-vax movement, led by a small number of extremely vocal, extremely self-confident individuals, began in 1998 with the publication of a fraudulent study (later retracted when the fraud was uncovered) claiming that vaccines caused autism. The study was led by former physician Andrew Wakefield, who later lost his medical license because of his fraud, which included misleading his co-authors and mistreating patients.

Nonetheless, this study was picked up and amplified by several celebrities with large followings, including former Playboy model and MTV host Jenny McCarthy and political activist Robert F. Kennedy Jr., the nephew of former US President John F. Kennedy.

Anti-vax activism is all over the web today, despite many efforts to quash it. Countless claims of "my child got sick after his vaccine jab" are presented as proof that vaccines cause harm, and under post-truthism, we're supposed to take such claims seriously.

These stories can be especially difficult to respond to, because many parents are dealing with children who have genuine health problems–including autism–and the parents often truly believe what they are saying. Thus it won't do to tell them to be quiet and go away: their children really do need medical care. They're simply wrong in believing that vaccines have anything to do with their children's problems. Their beliefs are amplified by Facebook groups and websites whose sole purpose is to echo and amplify the mistaken claims of anti-vaxxers.

Even though there was never any good evidence that vaccines cause autism, scientists have conducted dozens of studies involving literally millions of children to answer precisely that question, and the evidence is very, very clear: vaccines do not cause autism, nor do they cause any other systematic neurological or behavioral problems. Vaccines do prevent diseases, though, and unvaccinated children can and will get sick. Recent experience has also given us, tragically, many examples of children who died from entirely preventable infections, because their parents didn't vaccinate them. These tragedies didn't have to happen.

Every era is the age of truth. The idea that we're in a "post-truth era," despite being repeated thousands of times in articles, essays, and op-eds over the past decade, is a commentary on people's ability to fool themselves, not on the state of the world. Truth describes the world as it is, and those who choose to deny it might win a short-term argument, but in the long term, they will always lose.

Finally, on a more positive note, I'm somewhat heartened by efforts to educate college students on how to recognize and counter bogus ideas, such as the University of Washington course, Calling Bullshit, created by Carl Bergstrom and Jevin West. (Yes, that really is its title.) As the professors wrote,
"The world is awash in bullshit.... We're sick of it. It's time to do something, and as educators, one constructive thing we know how to do is to teach people." 
If you're interested, they've made the lectures available as free videos on YouTube.

Labelling an era as post-anything has always, to me, suggested a lack of substance, as if the era is defined only by what came before it. The use of "post-" also ignores the fact that the next era will need a name too. Should we expect "post-post-truth" to come next, and what would that look like? Let's hope that the 2020s will give us a return to simple respect for truth. The future of civilization might very well depend on it.

Can the Apple Watch monitor heart health?

Can the Apple Watch accurately detect atrial fibrillation?

When I first heard that scientists were conducting a study to answer this question, I was deeply skeptical. A simple wrist device detecting heart arrhythmias? It seemed too simple to work.

But a new study, just published in the New England Journal of Medicine, lays out some pretty compelling evidence that the Apple Watch can do just that.

Atrial fibrillation, a type of irregular heartbeat closely related to atrial flutter, is the most common cardiac arrhythmia in the U.S., affecting some 6 million people. The NEJM article estimates that the lifetime risk for "a-fib" might be as high as 1 in 3. Atrial fibrillation isn't necessarily a problem on its own, but it can greatly increase the risk of stroke. Many people have episodes of a-fib without even being aware of them, which is why a simple, convenient way of detecting them could be medically valuable.

The Apple Watch study was truly enormous, with 419,297 subjects monitored for about 4 months. Only a company like Apple, with a popular product like its watch, can even hope to recruit this many people to join a study. The idea was pretty simple: use the Apple Watch to detect an irregular pulse, which might be a sign of atrial fibrillation or flutter.

During the course of the study, 2,161 subjects had at least one report of an irregular pulse, about 0.5% of the total. Each of these subjects was then sent an electrocardiogram (ECG) patch, which they were supposed to wear for several days to determine if they really were having arrhythmias.

Of course, not everyone wore the ECG patch as they were supposed to, but in the end, 450 subjects wore the patch for an average of about 6 days, filled in all the required questionnaires, and returned it for analysis. Among these subjects, 34% genuinely did have atrial fibrillation, as confirmed by the ECG patch. Some of them had only brief episodes, but a subgroup had nearly continuous symptoms.

The study also tried to determine the false positive rate of the Apple Watch warnings–that is, how often did it report an irregular pulse when the subject was not experiencing a-fib or atrial flutter? The researchers evaluated all of the reports from watches that were being worn by people who also had an ECG patch. In this analysis, 71% of the irregular pulse reports from the watch corresponded to atrial fibrillation simultaneously measured by the ECG patch. The other 29% weren't normal either: three-fourths of those reports were due to "frequent premature atrial contractions."
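The key proportions are easy to recompute from the counts reported above. A small Python sketch (the counts come from the study as described in the text; the confirmed-case count is an approximation implied by the reported 34% figure, not a number from the paper):

```python
n_subjects = 419_297   # total enrolled in the study
n_notified = 2_161     # got at least one irregular-pulse notification
n_patch = 450          # wore and returned the ECG patch

# Fraction of all subjects who ever received a notification
notification_rate = n_notified / n_subjects
print(f"{notification_rate:.2%}")  # ~0.52%, the "about 0.5%" in the text

# Approximate number of patch wearers with ECG-confirmed a-fib (34% reported)
n_confirmed = round(0.34 * n_patch)
print(n_confirmed)  # ~153 confirmed cases
```

Note that the 34% figure is a positive predictive value among those who completed the patch protocol, not the watch's overall accuracy.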

So overall, the Watch was surprisingly accurate, with an impressively low false positive rate.

How does it work? Well, the back of the watch contains several sensors that detect light (photodiodes), along with green and infrared LEDs that emit light. As Apple's website explains,
"By flashing its LED lights hundreds of times per second, Apple Watch can calculate the number of times the heart beats each minute."
This works because your skin is partially transparent: as everyone knows, you can see some of your blood vessels underneath your skin. In addition to measuring blood flow, the watch can also measure electrical signals using electrodes on the back of the watch and on a small dial on the side of the watch, called the Digital Crown. Here's how Apple explains this:
"When you place your finger on the Digital Crown, it creates a closed circuit between your heart and both arms, capturing the electrical impulses across your chest."
In other words, it behaves like an ECG monitor on your wrist. The NEJM study didn't use this feature of the Apple Watch, which suggests that the watch could be even more effective as a heart monitor in the future.

The Apple Watch is far from perfect, though. For one thing, we don't know how many episodes of atrial fibrillation it missed. Only 0.5% of subjects had a report of irregular pulse, but we don't know how many of the remaining 99.5% had an episode that the watch didn't detect. The authors of the study pointed out that they weren't trying to measure the watch's sensitivity, and they emphasized that
"the absence of an irregular pulse notification does not exclude possible arrhythmias."
Another caveat is that the study was funded by Apple, although it was led by scientists at Stanford and included scientists from multiple other highly regarded universities and medical schools. The funding was clearly disclosed in the NEJM article.

On the other hand, using the Apple Watch is far easier than other currently available procedures for monitoring your heart. Patients who have episodes of arrhythmia are typically told to wear a heart-rate monitor for days or weeks at a time. This involves taping electrodes to half a dozen places on your body, all of which are connected by wires to a device (basically a cell phone) that records the readings and sends them to a monitoring company. These monitors are also far more expensive to operate than an Apple Watch.

So despite its imperfections, the Apple Watch might be the vanguard of a new wave of lightweight, less intrusive devices for monitoring our health. The technology is only going to get better.

$3.7 million to study quack medicine at a leading cancer center

Sometimes I'm not sure whether the best response to pseudoscience is to ignore it, or to patiently try to explain why it's wrong, or to get mad.

This week I'm mad.

My anger and frustration were triggered by a tweet from Memorial Sloan-Kettering's Integrative Medicine account.
For those who don't know, Memorial Sloan-Kettering Cancer Center is one of the world's leading cancer centers, both for treatment and research. If you are diagnosed with cancer, MSK is one of the best places to go.

But not everything at MSK is world class. Unfortunately, they have an "integrative medicine" center that offers a mixture of therapies ranging from helpful to benign to useless. One of their biggest activities is acupuncture, which they claim offers a wide range of benefits to cancer patients.

The MSK tweet was boasting about a new, $3.7 million NIH-funded study of the effect of acupuncture on pain that cancer patients experience from chemotherapy and bone-marrow transplants.

Here's why I'm mad: cancer patients are extremely vulnerable, often suffering the most frightening and difficult experience of their lives. They are completely dependent on medical experts to help them. When a place like MSK suggests a treatment, patients take it very seriously–as they should. But they really have no choice: a cancer patient cannot easily look for a second opinion, or switch hospitals or doctors. Even if they have the money (and cancer treatment is extremely expensive), switching hospitals might involve a long interruption with no treatment, during which they could die, and it might also involve traveling far from their home.

Offering these patients ineffective treatments based on pseudoscience–and make no mistake, that's what acupuncture is–is immoral. Now, I strongly suspect that the MSK's "integrative medicine" doctors sincerely believe that acupuncture works. Their director, Jun Mao, is clearly a true believer, as explained in this profile of him on the MSK website. But that doesn't make it okay.

I've written about acupuncture many times before (here, here, here, and here, for example), but let me explain afresh why it is nonsense.

Acupuncture is based on a pre-scientific notion, invented long before humans understood physiology, chemistry, neurology, or even basic physics, which posits that a mysterious life force, called "qi," flows through the body on energy lines called meridians. As explained in this article by MSK's Jun Mao:
"According to traditional Chinese medicine ... interruption or obstruction of qi was believed to make one vulnerable to illness. The insertion of needles at specific meridian acupoints was thought to regulate the flow of qi, thus producing therapeutic benefit."
Today we know that none of this exists. There is no qi, and there are no meridians. In that same article, Jun Mao continued by admitting that
"the ideas of qi and meridians are inconsistent with the modern understanding of human anatomy and physiology."
And yet this is what they offer to patients at MSK.

Just to be certain, I read one of the latest studies from MSK, published early this year, which claims to show that acupuncture relieves nausea, drowsiness, and lack of appetite in multiple myeloma patients who were going through stem cell transplants.

It's a mess: totally unconvincing, and a textbook case of p-hacking (or data dredging). The paper describes a very small study, with just 60 patients total, in which they measured literally dozens of possible outcomes: overall symptom score at 3 different time points, a different score at 3 time points, each of 13 symptoms individually, and more. I counted 24 different p-values, most of them not even close to significant, but they fixated on the 3 that reached statistical significance. The two groups of 30 patients weren't properly balanced: the sham acupuncture group started out with more severe symptoms according to their own scoring metric, and Figure 2 in the paper makes it pretty clear that there was no genuine difference in the effects of real versus sham acupuncture.
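To see why 3 "significant" results out of 24 tests is unimpressive, consider what chance alone produces. If all 24 null hypotheses were true and the tests were independent (a simplifying assumption; correlated outcomes change the exact number), the count of p-values below 0.05 follows a binomial distribution:

```python
from math import comb

n, alpha = 24, 0.05  # 24 outcomes tested at the 0.05 threshold

def p_exactly(k):
    """Probability of exactly k nominally significant results by chance."""
    return comb(n, k) * alpha**k * (1 - alpha)**(n - k)

p_three_or_more = 1 - sum(p_exactly(k) for k in range(3))
print(f"{p_three_or_more:.2f}")  # ~0.12
```

In other words, even with no real effect at all, a study measuring 24 outcomes has roughly a 1-in-9 chance of producing at least 3 nominally significant results, which is exactly why fixating on those 3 is textbook p-hacking.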

But they got it published (in a mediocre journal), so now they point to it as "proof" that acupuncture works for cancer patients. This study, bad as it is, appears to be the basis of the $3.7 million NIH grant that they're now going to use, they say, in "a larger study in 300 patients to confirm these previous findings."

And there you go: the goal of the new study, according to the scientists themselves, is not to see if the treatment works, but to confirm their pre-existing belief that acupuncture works. Or, as one scientist remarked on Twitter, "they already have a result in mind, the whole wording of this suggests that they EXPECT a positive outcome. How did this get funded exactly?"

Good question.

So I'm mad. I'm mad that NIH is spending millions of dollars on yet another study of a quack treatment (acupuncture) that should have been abandoned decades ago, but that persists because people make money off it. (And, as others have explained in detail, acupuncture is actually a huge scam that former Chinese dictator Mao Zedong foisted on his own people, because he couldn't afford to offer them real medicine. For a good exposé of Chairman Mao's scam, see this 2013 Slate piece.)

But I'm even more upset that doctors at one of the world's leading cancer centers are telling desperately ill patients, who trust them with their lives, that sticking needles into their bodies at bogus "acupuncture points" will relieve the pain and nausea of chemotherapy, or help them with other symptoms of cancer. I'm willing to bet that most MSK doctors don't believe any of this, but they don't want to invest the time or energy to try to stop it.

(I am somewhat reassured by the fact that MSK's main Twitter account has nearly 75,000 followers, while its integrative medicine Twitter account has just 110.)

Or perhaps they are "shruggies": doctors who don't believe in nonsense, but figure it's probably harmless so they don't really object. To them I suggest this: read Dr. Val Jones's account of how she too was a shruggie, until she realized that pseudoscience causes real harm.

And finally, let me point to this study in JAMA Oncology from last year, by doctors from Yale, which looked at the use of so-called complementary therapies among cancer patients. They found that
"Patients who received complementary medicine were more likely to refuse other conventional cancer treatment, and had a higher risk of death than no complementary medicine."
And also see this 2017 study from the Journal of the National Cancer Institute, which found that patients who used alternative medicine were 2.5 times more likely to die than patients who stuck to modern medical treatments.

That's right, Memorial Sloan-Kettering: patients who use non-traditional therapies are twice as likely to die. That's why I'm mad. This is not okay.

Gluten-free diet has no effect on autism, according to a new study

Parents of autistic children are constantly seeking new treatments. Autism spectrum disorder, or ASD, is a developmental disorder that causes problems in social interaction and communication, ranging from mild to severe. It's an extremely challenging condition for parents, and as of today there is no cure.

However, there are plenty of websites that offer treatments for autism, many of them unproven. One of the more common claims is that autistic children will benefit from a gluten-free, casein-free diet. There has been some weakly supportive evidence for this idea, such as this 2012 report from Penn State, but that study was based entirely on interviews with parents. Interviews are notoriously unreliable for scientific data collection.

Perhaps because so few effective treatments are available, many parents of autistic children have tried gluten-free diets, in the hope that perhaps they might work. (One can find entire books dedicated to this diet.)

The science behind the idea that gluten or casein causes (or worsens) autism has always been sketchy. The push for diet-based treatments has its origins in the anti-vaccine movement, beginning with the fraudulent 1998 study (eventually retracted) in The Lancet led by Andrew Wakefield, a former gastroenterologist who lost his medical license after his fraud was discovered. Wakefield claimed that autism was caused by a "leaky gut," which somehow allowed vaccine particles to make their way to the brain, which in turn caused autism. That chain of events was never supported by scientific evidence. Nonetheless, it morphed into the hypothesis that gluten (or casein) somehow leaks out of the intestines and causes some symptoms of autism. There's no evidence to support that either.

(Despite losing his medical license, Wakefield has become a leading voice in the anti-vaccine movement, making speeches and even movies to try to convince parents not to vaccinate their kids. Many journalists and scientists have written about him and the harm that he's done, but that's not my topic today.)

Another hypothesis, according to WebMD, is that autistic children have some kind of allergic reaction to gluten. There is no good evidence for this either.

Surprisingly, virtually no good studies have asked the question, do gluten-containing foods actually cause the symptoms of autism? Now we have a carefully-done study that provides an answer.

The new study, just published in the Journal of Autism and Developmental Disorders, is the first randomized, well-controlled study of gluten-free diets in children with autism. The scientists, all from the University of Warsaw in Poland, recruited 66 children ranging from 3 to 5 years old and assigned half of them at random to a gluten-free diet for 6 months. The other half ate a normal diet, with at least one meal a day containing gluten. After 6 months, the scientists evaluated all the children using multiple standardized measurements of autistic behavior.

The results were very clear: the study found no difference between the diets. None of the core symptoms of ASD were different between children in the two groups, and there were no differences in gastrointestinal symptoms either. As the study itself stated:
"There is no evidence either against or in favor of gluten avoidance for managing symptoms of ASD in children." 
This study should put to rest all of the claims that a gluten-free diet can somehow improve the symptoms of autism. It doesn't provide an easy answer for parents, and the medical community still needs to do much more work to find better treatments. But let's hope that parents get the message: don't feed your autistic child a restricted diet.

What's the proper amount of dog for optimal health?

Humans have had dogs as companions for thousands of years. Over that time, dogs have evolved to become ever-better companions, as we humans selectively bred them for traits that we like, such as friendliness and loyalty.

Dog owners already know that owning a dog reduces stress. But it turns out that the health benefits of owning a dog go quite a bit further: two new studies published this month in the journal Circulation both found that owning a dog reduces your risk of dying.

The first study, by Carolyn Kramer and colleagues at the University of Toronto, reviewed ten other studies dating back more than 50 years, covering 3.8 million people. They compared dog owners to non-owners and found that dog owners had a 24% lower risk of dying, from any cause, over a 10-year period. The benefit was even greater for people who'd suffered a heart attack: those who had a dog at home after their heart attack had a 65% lower risk of dying.

The second study, by Tove Fall and colleagues at Uppsala University, focused on the benefits of owning a dog for people who have had a heart attack or a stroke. They used the Swedish National Patient Register to identify 335,000 patients who'd suffered one of these events between 2000 and 2012, about 5% of whom were dog owners. They found even greater benefits than the first study: among people who'd had a heart attack, the risk of dying was 33% lower for dog owners than for people who lived alone. The benefits were smaller but still significant for people who lived with a companion (a spouse or a child): they had a 15% lower risk of dying if they also owned a dog. For those who'd had a stroke, the risk of dying for dog owners was 27% lower than for people who lived alone, and 12% lower than for people who lived with a companion but didn't have a dog. This study measured risk over a 4-5 year follow-up period.

These studies are consistent with many other scientific reports stretching back decades, all pointing in the same direction: dog ownership is good for your health. In fact, back in 2013 the American Heart Association issued an official statement on "Pet Ownership and Cardiovascular Risk" with this recommendation:
"Pet ownership, particularly dog ownership, may be reasonable for reduction in cardiovascular disease risk."
However, because the evidence was not very strong, the AHA also advised that people shouldn't get a pet "for the primary purpose of reducing CVD risk." In other words, don't get a dog if you don't want one. As every dog owner knows, owning a dog is much more trouble than simply taking a daily pill.

The new studies strengthen the existing evidence for the health benefits of owning a dog. In an accompanying editorial in Circulation, Dhruv Kazi from Harvard Medical School asks a critical question: is the association between dog ownership and reduced mortality just a correlation, or is it causal? He points out that studies have shown that dog ownership reduces blood pressure and other signs of stress, and that dog owners tend to get outside and walk more (with their dogs). Thus it's very plausible, medically speaking, that dog ownership is good for you. For these and other reasons, Kazi concludes that
"the association between dog ownership and improved survival is real, and is likely at least partially causal."
One final question is still nagging at me, though. Now that we know that dog ownership is good for your health, what's the optimal dose? Would it be even healthier to own two dogs rather than one? And if we throw in a cat, does that strengthen or weaken the effect? Finally, is it healthier to own a larger dog, or is a small one just as good?

Clearly, more research is needed.

[Note: the author discloses that he owns a rescue dog, a rather small terrier.]

Can You Improve Your Memory With A Jellyfish Protein?

Some colleagues of mine recently asked me about Prevagen, a supplement that is being advertised heavily on television as a memory booster. It's everywhere, they said, but what is it? And does it work?

Both questions are pretty easy to answer. On the first question, the TL;DR version is that Prevagen's primary ingredient is a protein called apoaequorin, which is found in a species of jellyfish that glows in the dark. These jellies produce two proteins, apoaequorin and green fluorescent protein (GFP), that help them fluoresce. It’s an amazing biological system, and the three scientists who discovered and developed the chemistry of GFP were awarded the 2008 Nobel Prize in chemistry.

Cool science! But what does this have to do with human memory? Not much, it turns out.

First let's examine what Prevagen's manufacturers, Quincy Bioscience, say about it. Their website claims that:
"Prevagen Improves Memory* 
Prevagen is a dietary supplement that has been clinically shown to help with mild memory loss associated with aging.* Prevagen is formulated with apoaequorin, which is safe and uniquely supports brain function.*"
Sounds pretty clear, right? But note the asterisks by each of these claims: if you scroll all the way down (or read the small print on their packages), you'll find out that:
"*These statements have not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure or prevent any disease."
You may recognize this language: it's what all supplement manufacturers use to avoid getting in trouble with the FDA. It means, essentially, that the government hasn't approved Prevagen to treat anything, including memory loss.

Despite Quincy’s claims, I see no reason why eating this protein would have any effect at all on brain function. First of all, it’s not even a human protein, so it's unlikely to work in humans. Second, even if it did work in humans, eating it would not deliver it to our brains, because it would almost certainly be broken down in the stomach. And third, the connection between any protein and memory is very complex, so simply having more of a protein is very, very unlikely to improve memory.

Quincy's website points to a single study that they themselves conducted, which they argue showed benefits for people with mild memory impairment. However, others have pointed out that the experiment (which was never published in a scientific journal) didn't show any such thing: overall there was no difference between people who took Prevagen and those who took a placebo, but the manufacturer did some p-hacking to extract a subgroup that appeared to get a benefit. As Dr. Harriet Hall and others have pointed out, this kind of p-hacking is bogus.
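To see why after-the-fact subgroup analysis is bogus, it helps to run the numbers. The simulation below is purely illustrative (it has nothing to do with Quincy's actual data): it generates trials where the supplement has no effect at all, then tests ten subgroups separately. Even with zero real effect, some subgroup crosses the p < 0.05 line much of the time.

```python
import math
import random

random.seed(42)

def subgroup_false_positive_rate(trials=2000, subgroups=10, per_arm=20):
    """Simulate null trials (treatment truly does nothing), test each
    subgroup with a two-sided z-test at p < 0.05, and count how often
    at least one subgroup looks 'significant' purely by chance."""
    # standard error of a difference of two means (per-person sd = 1)
    se = math.sqrt(2.0 / per_arm)
    hits = 0
    for _ in range(trials):
        for _g in range(subgroups):
            treated = [random.gauss(0, 1) for _ in range(per_arm)]
            placebo = [random.gauss(0, 1) for _ in range(per_arm)]
            z = (sum(treated) / per_arm - sum(placebo) / per_arm) / se
            if abs(z) > 1.96:  # nominally "significant" subgroup found
                hits += 1
                break
    return hits / trials

rate = subgroup_false_positive_rate()
# With 10 independent looks at 5% each, theory predicts a false-positive
# rate of about 1 - 0.95**10, i.e. roughly 40%.
```

In other words, "we found a subgroup that benefited" is close to a coin flip even when the product does nothing, which is why pre-registered primary endpoints matter.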

And what about my observation that the jellyfish protein will simply be digested in the stomach, and never make it to the brain? It turns out that the company itself admits that I'm right. On their website, they have a "research" page pointing to several safety studies, designed to show that Prevagen won't cause an immune reaction. One of these studies explains that
"Apoaequorin is easily digested by pepsin."
Pepsin is the chief digestive enzyme in your stomach. So Prevagen's main ingredient never gets beyond the stomach, which is why it's probably quite safe. (Joe Schwarcz at McGill University recently made the same point.)

Back in 2015, I asked Ted Dawson, the Abramson Professor of Neurodegenerative Diseases at Johns Hopkins School of Medicine, what he thought of Prevagen’s claims.
“It is hard to evaluate Prevagen as to the best of my knowledge there is no peer-reviewed publication on its use in memory and cognition,” said Dawson. “The study cited on the company’s web site is a small short study, raising concerns about the validity of the claims.”
Finally, a word to those who are still tempted to try Prevagen: it isn't cheap. Their website charges $75 for a bottle of 60 pills, each containing 10 mg of apoaequorin, or $90 for 30 pills of the "Professional formula," each containing 40 mg. (Note that there's no evidence that taking a higher dose will work any better.)

The FTC sued Quincy Bioscience in 2017 for deceptive advertising, arguing that claims that Prevagen boosts memory are false, and that claims it can get into the brain are also false. Just a few months ago, a judge ruled that the case can proceed. Meanwhile, though, the advertising and sales of Prevagen continue. The FTC case states that Quincy sold $165 million worth of Prevagen in the U.S. from 2007 to 2015.

So the bottom line is: jellyfish proteins are very cool, but eating them won't improve your memory. If you're interested in brain food, perhaps you should just eat more fish, which might actually work.

(Note: I wrote about Prevagen in 2015, and some elements of this article are based on my earlier one.)