Let the 2010s be the end of the post-truth era


The 2010s are over, and the double-20s are about to begin. One phrase that I'd like to never hear again is "post-truth era," an idea that has gained prominence during the past decade, especially the last few years.

One dominant theme of post-truthery is that every person has a right to his/her own interpretation of the facts, and that these alternative interpretations–or "alternative facts," as Trump adviser Kellyanne Conway famously called them–deserve to be taken seriously. They don't.

And yet, like most of the pseudoscience that I've been writing about for years now, we have to continue calling out nonsense for what it is, because some of it is harmful, and even deadly. Politics is a never-ending font of post-truthery (or truthiness, as Stephen Colbert defined it in 2005), but here I'm concerned about science. Today I'll highlight three anti-truth campaigns that have caused great harm over the past decades, in the hope that they will soon fade away.

Among serious scientists, the existence of truth is not in question. Indeed, the essence of science is the search for objective truth. Sometimes, though, a scientific discovery threatens to undermine the power or profits of an influential industry, and that's when industry resorts to denialism (which is essentially the same thing as post-truthiness).

Perhaps the most well-documented example of organized science denialism dates back to the 1950s, when accumulating evidence made it clear that smoking causes cancer. The tobacco industry didn't want to admit this, even though their own internal research supported it, because it meant that their main product was killing their customers, which in turn was terrible news for their business. In response,
"the tobacco companies helped manufacture the smoking controversy by funding scientific research that was intended to obfuscate and prolong the debate about smoking and health." (Cummings et al. 2007).
Eventually, after decades of lawsuits and literally millions of smoking-related deaths, the industry was forced to admit the truth and pay out billions of dollars in settlements in the U.S. Nonetheless, many people still smoke, and even as recently as 2016, the largest tobacco company in the U.S., Philip Morris, was still trying to deny the science about cigarettes.

My second example is more recent, and potentially even more harmful to the human species. You might have already guessed it: I'm talking about climate change denialism. For several decades, the evidence has been building that the planet is getting warmer. A series of reports from the Intergovernmental Panel on Climate Change (IPCC) warned, in increasingly confident terms, that humans were the primary cause of recent warming, mostly due to our use of fossil fuels and the carbon they emit into the atmosphere.

The current IPCC report states unequivocally that humans have already caused 1°C of warming, and that the warming will increase rapidly over the next several decades.

Scientifically, these facts are not in dispute. The world is already experiencing more severe storms, unprecedented floods and droughts, and die-offs due to warming oceans (such as the massive and tragic die-off of coral in the Great Barrier Reef).

However, the fossil fuel industry sees global warming as a threat to their profits. Rather than invest in new, cleaner forms of energy, large companies such as Exxon-Mobil and billionaire coal magnates such as the Koch brothers have poured countless millions of dollars into disinformation campaigns to cast doubt on the science of climate change. Prominent among these efforts is The Heartland Institute, a fossil-fuel-funded organization dedicated to exactly that mission. (Heartland has also worked to cast doubt on the link between smoking and cancer.)

Climate change denialists learned from the tobacco companies that it was possible to delay government action by decades, simply by casting doubt on the science. Unfortunately, their campaigns have been working. For years, many U.S. politicians denied that the planet was getting warmer. As that argument has become ever harder to make with a straight face, they've changed their strategy: now they might admit that the planet is heating up, but deny that human activities are responsible. They're still wrong.

The goal of this denialism is, simply put, to protect the profits of fossil fuel industries. However, truth doesn't care if you believe it or not. The world is getting hotter. Australia just had its four hottest days in recorded history. The unprecedented heat wave has led to hundreds of fires through much of the country, threatening every major city on the continent. There's no reason to think this won't keep happening.

Climate change denialists are only fooling themselves–after being duped by oil and coal companies.

My third and final example is one that has frustrated me for 15 years now, and it's one that just won't go away: the anti-vaccination movement. This is one of the most frustrating examples of post-truth wrongheadedness, in part because it seems so unnecessary, and because so many children have been harmed as a result.

First, let's be clear about the facts: vaccines are one of the greatest boons to human health in the history of science and medicine. People used to die by the millions from smallpox, polio, and a dozen other infections that are now almost completely preventable by vaccines.

Smallpox was 100% eliminated from the planet in 1980, in one of humankind's greatest public health triumphs. Polio has been eradicated from nearly all countries: a vaccination campaign that started in 1988 has reduced it to just a handful of countries and fewer than 100 cases worldwide. Both of these successes are due to vaccines.

Many other diseases, including measles, chicken pox, and bacterial meningitis, have been reduced so dramatically by vaccines that physicians in the U.S. and Europe almost never see a case. This graph shows the dramatic effect of the measles vaccine, which was introduced in the U.S. in the early 1960s:
Measles cases in the United States by year, 1954-2008. Source: CDC.
The graph shows the number of reported cases, but actual cases were much higher, likely around 4 million cases per year by one estimate. These are the facts.

So what happened? The modern anti-vax movement, led by a small number of extremely vocal, extremely self-confident individuals, began in 1998 with the publication of a fraudulent study (later retracted when the fraud was uncovered) claiming that vaccines caused autism. The study was led by former physician Andrew Wakefield, who later lost his medical license because of his fraud, which included misleading his co-authors and mistreating patients.

Nonetheless, this study was picked up and amplified by several celebrities with large followings, including former Playboy model and MTV host Jenny McCarthy and political activist Robert F. Kennedy Jr., a nephew of former US President John F. Kennedy.

Anti-vax activism is all over the web today, despite many efforts to quash it. Countless claims of "my child got sick after his vaccine jab" are presented as proof that vaccines cause harm, and under post-truthism, we're supposed to take such claims seriously.

These stories can be especially difficult to respond to, because many parents are dealing with children who have genuine health problems–including autism–and the parents often truly believe what they are saying. Thus it won't do to tell them to be quiet and go away: their children really do need medical care. They're simply wrong in believing that vaccines have anything to do with their children's problems. Their beliefs are amplified by Facebook groups and websites whose sole purpose is to echo and amplify the mistaken claims of anti-vaxxers.

Even though there was never any good evidence that vaccines cause autism, scientists have conducted dozens of studies involving literally millions of children to answer precisely that question, and the evidence is very, very clear: vaccines do not cause autism, nor do they cause any other systematic neurological or behavioral problems. Vaccines do prevent diseases, though, and unvaccinated children can and will get sick. Recent experience has also given us, tragically, many examples of children who died from entirely preventable infections, because their parents didn't vaccinate them. These tragedies didn't have to happen.

Every era is the age of truth. The idea that we're in a "post-truth era," despite being repeated thousands of times in articles, essays, and op-eds over the past decade, is a commentary on people's ability to fool themselves, not on the state of the world. Truth describes the world as it is, and those who choose to deny it might win a short-term argument, but in the long term, they will always lose.

Finally, on a more positive note, I'm somewhat heartened by efforts to educate college students on how to recognize and counter bogus ideas, such as the University of Washington course, Calling Bullshit, created by Carl Bergstrom and Jevin West. (Yes, that really is its title.) As the professors wrote,
"The world is awash in bullshit.... We're sick of it. It's time to do something, and as educators, one constructive thing we know how to do is to teach people." 
If you're interested, they've made the lectures available as free videos on YouTube.

Labelling an era as post-anything has always, to me, suggested a lack of substance, as if the era is defined only by what came before it. The use of "post-" also ignores the fact that the next era will need a name too. Should we expect "post-post-truth" to come next, and what would that look like? Let's hope that the 2020s will give us a return to simple respect for truth. The future of civilization might very well depend on it.

Can the Apple Watch monitor heart health?

Can the Apple Watch accurately detect atrial fibrillation?

When I first heard that scientists were conducting a study to answer this question, I was deeply skeptical. A simple wrist device detecting heart arrhythmias? It seemed too good to be true.

But a new study, just published in the New England Journal of Medicine, lays out some pretty compelling evidence that the Apple Watch can do just that.

Atrial fibrillation (or atrial flutter) is a type of irregular heartbeat that is the most common cardiac arrhythmia in the U.S., affecting some 6 million people. The NEJM article estimates that the lifetime risk for "a-fib" might be as high as 1 in 3. Atrial fibrillation isn't necessarily a problem on its own, but it can greatly increase the risk of stroke. Many people have episodes of a-fib without even being aware of them, which is why a simple, convenient way of detecting them could be medically valuable.

The Apple Watch study was truly enormous, with 419,297 subjects monitored for about 4 months. Only a company like Apple, with a popular product like its watch, can even hope to recruit this many people to join a study. The idea was pretty simple: use the Apple Watch to detect an irregular pulse, which might be a sign of atrial fibrillation or flutter.

During the course of the study, 2,161 subjects had at least one report of an irregular pulse, about 0.5% of the total. Each of these subjects was then sent an electrocardiogram (ECG) patch, which they were supposed to wear for several days to determine if they really were having arrhythmias.

Of course, not everyone wore the ECG patch as they were supposed to, but in the end, 450 subjects wore the ECG patch (for an average of about 6 days), filled in all the required questionnaires, and returned the patch for analysis. Among these subjects, 34% genuinely did have atrial fibrillation as confirmed by the ECG patch. Some of them had only brief episodes, but a subgroup had nearly continuous symptoms.

The study also tried to determine the false positive rate of the Apple Watch warnings–that is, how often did it report an irregular pulse when the subject was not experiencing a-fib or atrial flutter? The researchers evaluated all of the reports from watches that were being worn by people who also had an ECG patch. In this analysis, 71% of the irregular pulse reports from the watch corresponded to atrial fibrillation simultaneously measured by the ECG patch–a positive predictive value of 71%. The other 29% weren't normal either: three-fourths of those reports were due to "frequent premature atrial contractions."
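For the quantitatively inclined: that 71% figure is just the fraction of watch alerts confirmed by the ECG patch. Here's a back-of-the-envelope sketch (with rounded, illustrative counts, not the study's raw data):

```python
# Illustrative, rounded counts -- not the raw data from the NEJM study.
confirmed_alerts = 71    # watch alerts with simultaneous a-fib on the ECG patch
unconfirmed_alerts = 29  # watch alerts without simultaneous a-fib

ppv = confirmed_alerts / (confirmed_alerts + unconfirmed_alerts)
print(f"Positive predictive value: {ppv:.0%}")  # -> 71%
```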

So overall, the Watch was surprisingly accurate, with an impressively low false positive rate.

How does it work? Well, the back of the watch contains several sensors that detect light (photodiodes), along with green and infrared LEDs that emit light. As Apple's website explains,
"By flashing its LED lights hundreds of times per second, Apple Watch can calculate the number of times the heart beats each minute."
This works because your skin is partially transparent: as everyone knows, you can see some of your blood vessels underneath your skin. In addition to measuring blood flow, the watch can also measure electrical signals using electrodes on the back of the watch and on a small dial on the side of the watch, called the Digital Crown. Here's how Apple explains this:
"When you place your finger on the Digital Crown, it creates a closed circuit between your heart and both arms, capturing the electrical impulses across your chest."
In other words, it behaves like an ECG monitor on your wrist. The NEJM study didn't use this feature of the Apple Watch, which suggests that the watch could be even more effective as a heart monitor in the future.
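To get an intuition for how a pulse can be read from light, here's a minimal sketch (my own illustration, emphatically not Apple's algorithm): treat the reflected-light signal as a waveform, count its peaks to estimate beats per minute, and look at the spacing between peaks to spot irregularity.

```python
# Toy photoplethysmography (PPG) example -- not Apple's actual algorithm.
import numpy as np
from scipy.signal import find_peaks

fs = 100                      # sensor samples per second
t = np.arange(0, 10, 1 / fs)  # 10 seconds of readings

# Simulate reflected-light intensity pulsing at 72 beats/minute, plus noise.
signal = np.sin(2 * np.pi * (72 / 60) * t) + 0.2 * np.random.randn(len(t))

# Each peak in the waveform is one heartbeat.
peaks, _ = find_peaks(signal, distance=0.4 * fs)  # ignore peaks < 0.4 s apart
bpm = len(peaks) / 10 * 60
print(f"Estimated heart rate: {bpm:.0f} bpm")

# An irregular rhythm shows up as high variability in the gaps
# between successive heartbeats (the inter-beat intervals).
intervals = np.diff(peaks) / fs
print(f"Inter-beat interval std: {intervals.std():.3f} s")
```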

The Apple Watch is far from perfect, though. For one thing, we don't know how many episodes of atrial fibrillation it missed. Only 0.5% of subjects had a report of irregular pulse, but we don't know how many of the remaining 99.5% had an episode that the watch didn't detect. The authors of the study pointed out that they weren't trying to measure the watch's sensitivity, and they emphasized that
"the absence of an irregular pulse notification does not exclude possible arrhythmias."
Another caveat is that the study was funded by Apple, although it was led by scientists at Stanford and included scientists from multiple other highly regarded universities and medical schools. The funding was clearly disclosed in the NEJM article.

On the other hand, using the Apple Watch is far easier than other currently available procedures for monitoring your heart. Patients who have episodes of arrhythmia are typically told to wear a heart-rate monitor for days or weeks at a time. This involves taping electrodes to half a dozen places on your body, all of which are connected by wires to a device (basically a cell phone) that records the readings and sends them to a monitoring company. These monitors are very expensive to operate, far more than the Apple Watch.

So despite its imperfections, the Apple Watch might be the vanguard of a new wave of lightweight, less intrusive devices for monitoring our health. The technology is only going to get better.

$3.7 million to study quack medicine at a leading cancer center

Sometimes I'm not sure whether the best response to pseudoscience is to ignore it, or to patiently try to explain why it's wrong, or to get mad.

This week I'm mad.

My anger and frustration were triggered by a tweet from Memorial Sloan-Kettering's Integrative Medicine account.
For those who don't know, Memorial Sloan-Kettering Cancer Center is one of the world's leading cancer centers, both for treatment and research. If you are diagnosed with cancer, MSK is one of the best places to go.

But not everything at MSK is world class. Unfortunately, they have an "integrative medicine" center that offers a mixture of therapies ranging from helpful to benign to useless. One of their biggest activities is acupuncture, which they claim offers a wide range of benefits to cancer patients.

The MSK tweet was boasting about a new, $3.7 million study funded by NIH to study the effect of acupuncture on pain that cancer patients experience from chemotherapy and bone-marrow transplants.

Here's why I'm mad: cancer patients are extremely vulnerable, often suffering the most frightening and difficult experience of their lives. They are completely dependent on medical experts to help them. When a place like MSK suggests a treatment, patients take it very seriously–as they should. But they really have no choice: a cancer patient cannot easily look for a second opinion, or switch hospitals or doctors. Even if they have the money (and cancer treatment is extremely expensive), switching hospitals might involve a long interruption with no treatment, during which they could die, and it might also involve traveling far from their home.

Offering these patients ineffective treatments based on pseudoscience–and make no mistake, that's what acupuncture is–is immoral. Now, I strongly suspect that MSK's "integrative medicine" doctors sincerely believe that acupuncture works. Their director, Jun Mao, is clearly a true believer, as explained in this profile of him on the MSK website. But that doesn't make it okay.

I've written about acupuncture many times before (here, here, here, and here, for example), but let me explain afresh why it is nonsense.

Acupuncture is based on a pre-scientific notion, invented long before humans understood physiology, chemistry, neurology, or even basic physics, which posits that a mysterious life force, called "qi," flows through the body on energy lines called meridians. As explained in this article by MSK's Jun Mao:
"According to traditional Chinese medicine ... interruption or obstruction of qi was believed to make one vulnerable to illness. The insertion of needles at specific meridian acupoints was thought to regulate the flow of qi, thus producing therapeutic benefit."
Today we know that none of this exists. There is no qi, and there are no meridians. In that same article, Jun Mao continued by admitting that
"the ideas of qi and meridians are inconsistent with the modern understanding of human anatomy and physiology."
And yet this is what they offer to patients at MSK.

Just to be certain, I read one of the latest studies from MSK, published early this year, which claims to show that acupuncture relieves nausea, drowsiness, and lack of appetite in multiple myeloma patients who were going through stem cell transplants.

It's a mess: totally unconvincing, and a textbook case of p-hacking (or data dredging). The paper describes a very small study, with just 60 patients total, in which they measured literally dozens of possible outcomes: overall symptom score at 3 different time points, a different score at 3 time points, each of 13 symptoms individually, and more. I counted 24 different p-values, most of them not even close to significant, but they fixated on the 3 that reached statistical significance. The two groups of 30 patients weren't properly balanced: the sham acupuncture group started out with more severe symptoms according to their own scoring metric, and Figure 2 in the paper makes it pretty clear that there was no genuine difference in the effects of real versus sham acupuncture.
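To see why measuring dozens of outcomes in a tiny trial is a recipe for false "discoveries," here's a quick simulation (purely illustrative, with no real data): give patients a treatment with zero effect, test 24 outcomes at the usual p < 0.05 threshold, and watch how often something comes up "significant" anyway.

```python
# Simulating p-hacking: a treatment with zero real effect,
# measured on 24 different outcomes. Illustrative only.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
n_trials, n_outcomes, n_per_group = 2_000, 24, 30

lucky_trials = 0
for _ in range(n_trials):
    p_values = [
        ttest_ind(rng.normal(size=n_per_group),
                  rng.normal(size=n_per_group)).pvalue
        for _ in range(n_outcomes)
    ]
    if min(p_values) < 0.05:  # at least one "significant" result by chance
        lucky_trials += 1

print(f"Trials with >=1 'significant' outcome: {lucky_trials / n_trials:.0%}")
# Analytically: 1 - 0.95**24, or about 71% of the time.
```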

But they got it published (in a mediocre journal), so now they point to it as "proof" that acupuncture works for cancer patients. This study, bad as it is, appears to be the basis of the $3.7 million NIH grant that they're now going to use, they say, in "a larger study in 300 patients to confirm these previous findings."

And there you go: the goal of the new study, according to the scientists themselves, is not to see if the treatment works, but to confirm their pre-existing belief that acupuncture works. Or, as one scientist remarked on Twitter, "they already have a result in mind, the whole wording of this suggests that they EXPECT a positive outcome. How did this get funded exactly?"

Good question.

So I'm mad. I'm mad that NIH is spending millions of dollars on yet another study of a quack treatment (acupuncture) that should have been abandoned decades ago, but that persists because people make money off it. (And, as others have explained in detail, acupuncture is actually a huge scam that former Chinese dictator Mao Zedong foisted on his own people, because he couldn't afford to offer them real medicine. For a good exposé of Chairman Mao's scam, see this 2013 Slate piece.)

But I'm even more upset that doctors at one of the world's leading cancer centers are telling desperately ill patients, who trust them with their lives, that sticking needles into their bodies at bogus "acupuncture points" will relieve the pain and nausea of chemotherapy, or help them with other symptoms of cancer. I'm willing to bet that most MSK doctors don't believe any of this, but they don't want to invest the time or energy to try to stop it.

(I am somewhat reassured by the fact that MSK's Twitter account has nearly 75,000 followers, while its integrative medicine Twitter account has just 110.)

Or perhaps they are "shruggies": doctors who don't believe in nonsense, but figure it's probably harmless so they don't really object. To them I suggest this: read Dr. Val Jones's account of how she too was a shruggie, until she realized that pseudoscience causes real harm.

And finally, let me point to this study in JAMA Oncology from last year, by doctors from Yale, which looked at the use of so-called complementary therapies among cancer patients. They found that
"Patients who received complementary medicine were more likely to refuse other conventional cancer treatment, and had a higher risk of death than no complementary medicine."
And also see this 2017 study from the Journal of the National Cancer Institute, which found that patients who used alternative medicine were 2.5 times more likely to die than patients who stuck to modern medical treatments.

That's right, Memorial Sloan-Kettering: patients who use non-traditional therapies are twice as likely to die. That's why I'm mad. This is not okay.

Gluten-free diet has no effect on autism, according to a new study

Parents of autistic children are constantly seeking new treatments. Autism spectrum disorder, or ASD, is a developmental disorder that causes problems in social interaction and communication, ranging from mild to severe. It's an extremely challenging condition for parents, and as of today there is no cure.

However, there are plenty of websites that offer treatments for autism, many of them unproven. One of the more common claims is that autistic children will benefit from a gluten-free, casein-free diet. There has been some weakly supportive evidence for this idea, such as this 2012 report from Penn State, but that study was based entirely on interviews with parents. Interviews are notoriously unreliable for scientific data collection.

Perhaps because so few effective treatments are available, many parents of autistic children have tried gluten-free diets, in the hope that perhaps they might work. (One can find entire books dedicated to this diet.)

The science behind the idea that gluten or casein causes (or worsens) autism has always been sketchy. The push for diet-based treatments has its origins in the anti-vaccine movement, beginning with the fraudulent 1998 study (eventually retracted) in The Lancet led by Andrew Wakefield, a former gastroenterologist who lost his medical license after his fraud was discovered. Wakefield claimed that autism was caused by a "leaky gut," which somehow allowed vaccine particles to make their way to the brain, which in turn caused autism. That chain of events was never supported by scientific evidence. Nonetheless, it morphed into the hypothesis that gluten (or casein) somehow leaks out of the intestines and causes some symptoms of autism. There's no evidence to support that either.

(Despite losing his medical license, Wakefield has become a leading voice in the anti-vaccine movement, making speeches and even movies to try to convince parents not to vaccinate their kids. Many journalists and scientists have written about him and the harm that he's done, but that's not my topic today.)

Another hypothesis, according to WebMD, is that autistic children have some kind of allergic reaction to gluten. There is no good evidence for this either.

Surprisingly, virtually no good studies have asked the question: do gluten-containing foods actually cause the symptoms of autism? Now we have a carefully done study that provides an answer.

The new study, just published in the Journal of Autism and Developmental Disorders, is the first randomized, well-controlled study of gluten-free diets in children with autism. The scientists, all from the University of Warsaw, Poland, recruited 66 children, and assigned half of them at random to a gluten-free diet. The other half were given a normal diet, with at least one meal a day containing gluten, for 6 months. The children ranged from 3 to 5 years old. After 6 months, the scientists evaluated all children using multiple standardized measurements of autistic behavior.
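A useful way to gauge what a trial of this size can detect is a power calculation. Here's a rough simulation (my own sketch, with assumed effect sizes, not numbers from the paper): simulate many 33-per-arm trials with a real difference of a given size, and count how often that difference reaches significance.

```python
# Rough power simulation for a two-arm trial, 33 children per group.
# Effect sizes are assumptions for illustration, not from the study.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_per_arm, n_sims, alpha = 33, 5_000, 0.05

for d in (0.3, 0.5, 0.8):  # small, medium, large effects (Cohen's d)
    hits = sum(
        ttest_ind(rng.normal(0, 1, n_per_arm),
                  rng.normal(d, 1, n_per_arm)).pvalue < alpha
        for _ in range(n_sims)
    )
    print(f"d = {d}: power ~ {hits / n_sims:.0%}")
```

The upshot: a trial this size would very likely have caught a large benefit, so finding nothing is informative (though a subtle effect could still slip through).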

The results were very clear: the study found no difference between the diets. None of the core symptoms of ASD were different between children in the two groups, and there were no differences in gastrointestinal symptoms either. As the study itself stated:
"There is no evidence either against or in favor of gluten avoidance for managing symptoms of ASD in children." 
This study should put to rest all of the claims that a gluten-free diet can somehow improve the symptoms of autism. It doesn't provide an easy answer for parents, and the medical community still needs to do much more work to find better treatments. But let's hope that parents get the message: don't feed your autistic child a restricted diet.

What's the proper amount of dog for optimal health?

Humans have had dogs as companions for thousands of years. Over that time, dogs have evolved to become ever-better companions, as we humans selectively bred them for traits that we like, such as friendliness and loyalty.

Dog owners already know that owning a dog reduces stress. But it turns out that the health benefits of owning a dog go quite a bit further: two new studies published this month in the journal Circulation both found that owning a dog reduces your risk of dying.

The first study, by Carolyn Kramer and colleagues at the University of Toronto, reviewed ten other studies dating back more than 50 years, covering 3.8 million people. They compared dog owners to non-owners and found that dog owners had a 24% lower risk of dying, from any cause, over a 10-year period. The benefit was even greater for people who'd suffered a heart attack: those who had a dog at home after their heart attack had a 65% lower risk of dying.

The second study, by Tove Fall and colleagues at Uppsala University, focused on the benefits of owning a dog for people who have had a heart attack or a stroke. They used the Swedish National Patient Register to identify 335,000 patients who'd suffered one of these events between 2000 and 2012, about 5% of whom were dog owners. They found even greater benefits than the first study: among people who'd had a heart attack, their risk of dying was 33% lower if they owned a dog as compared to people who lived alone. The benefits were smaller but still significant for people who lived with a companion (a spouse or a child): they still had a 15% lower risk of dying if they also owned a dog. For those who'd had a stroke, the risk of dying for dog owners was 27% lower than for people who lived alone, and 12% lower than for people who lived with a companion but didn't have a dog. This study measured the risk over a 4-5 year followup period.
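For readers wondering where figures like "33% lower risk" come from, they are simple ratios of death rates between groups. A toy calculation (with invented counts, not the Swedish registry data):

```python
# Invented counts for illustration -- not the Swedish registry data.
deaths_dog, n_dog = 100, 1000      # heart-attack survivors who own a dog
deaths_alone, n_alone = 150, 1000  # heart-attack survivors living alone

risk_dog = deaths_dog / n_dog        # 0.10
risk_alone = deaths_alone / n_alone  # 0.15

relative_risk = risk_dog / risk_alone
print(f"Relative risk: {relative_risk:.2f}")       # 0.67
print(f"Risk reduction: {1 - relative_risk:.0%}")  # 33% lower
```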

These studies are consistent with many other scientific reports, stretching back decades. They're all consistent, and they all point in the same direction: dog ownership is good for your health. In fact, back in 2013 the American Heart Association issued an official statement on "Pet Ownership and Cardiovascular Risk" with this recommendation:
"Pet ownership, particularly dog ownership, may be reasonable for reduction in cardiovascular disease risk."
However, because the evidence was not very strong, the AHA also advised that people shouldn't get a pet "for the primary purpose of reducing CVD risk." In other words, don't get a dog if you don't want one. As every dog owner knows, owning a dog is much more trouble than simply taking a daily pill.

The new studies strengthen the existing evidence for the health benefits of owning a dog. In an accompanying editorial in Circulation, Dhruv Kazi from Harvard Medical School asks a critical question: is the association between dog ownership and reduced mortality just a correlation, or is it causal? He points out that studies have shown that dog ownership reduces blood pressure and other signs of stress, and that dog owners tend to get outside and walk more (with their dogs). Thus it's very plausible, medically speaking, that dog ownership is good for you. For these and other reasons, Kazi concludes that
"the association between dog ownership and improved survival is real, and is likely at least partially causal."
One final question is still nagging at me, though. Now that we know that dog ownership is good for your health, what's the optimal dose? Would it be even healthier to own two dogs rather than one? And what if we throw in a cat, does that strengthen or reduce the effect? Finally, is it healthier to own a larger dog, or is a small one just as good?

Clearly, more research is needed.

[Note: the author discloses that he owns a rescue dog, a rather small terrier.]

Can You Improve Your Memory With A Jellyfish Protein?

Some colleagues of mine recently asked me about Prevagen, a supplement that is being advertised heavily on television as a memory booster. It's everywhere, they said–but what is it? And does it work?

Both questions are pretty easy to answer. On the first question, the TL;DR version is that Prevagen's primary ingredient is a protein called apoaequorin, which is found in a species of jellyfish that glows in the dark. These jellies produce two proteins, apoaequorin and green fluorescent protein (GFP), that help them fluoresce. It’s an amazing biological system, and the three scientists who discovered and developed the chemistry of GFP were awarded the 2008 Nobel Prize in chemistry.

Cool science! But what does this have to do with human memory? Not much, it turns out.

First let's examine what Prevagen's manufacturers, Quincy Bioscience, say about it. Their website claims that:
"Prevagen Improves Memory* 
Prevagen is a dietary supplement that has been clinically shown to help with mild memory loss associated with aging.* Prevagen is formulated with apoaequorin, which is safe and uniquely supports brain function.*"
Sounds pretty clear, right? But note the asterisks by each of these claims: if you scroll all the way down (or read the small print on their packages), you'll find out that:
"*These statements have not been evaluated by the Food and Drug Administration. This product is not intended to diagnose, treat, cure or prevent any disease."
You may recognize this language: it's what all supplement manufacturers use to avoid getting in trouble with the FDA. It means, essentially, that the government hasn't approved Prevagen to treat anything, including memory loss.

Despite Quincy’s claims, I see no reason why eating this protein would have any effect at all on brain function. First of all, it’s not even a human protein, so it's unlikely to work in humans. Second, even if it did work in humans, eating it would not deliver it to our brains, because it would almost certainly be broken down in the stomach. And third, the connection between any protein and memory is very complex, so simply having more of a protein is very, very unlikely to improve memory.

Quincy's website points to a single study that they themselves conducted, which they argue showed benefits for people with mild memory impairment. However, others have pointed out that the experiment (which was never published in a scientific journal) didn't show any such thing: overall there was no difference between people who took Prevagen and those who took a placebo, but the manufacturers did some p-hacking to extract a subgroup that appeared to get a benefit. As Dr. Harriet Hall and others have pointed out, this kind of p-hacking is bogus.

And what about my observation that the jellyfish protein will simply be digested in the stomach, and never make it to the brain? It turns out that the company itself admits that I'm right. On their website, they have a "research" page pointing to several safety studies, designed to show that Prevagen won't cause an immune reaction. One of these studies explains that
"Apoaequorin is easily digested by pepsin."
Pepsin is the chief digestive enzyme in your stomach. So Prevagen's main ingredient never gets beyond the stomach, which is why it's probably quite safe. (Joe Schwarcz at McGill University recently made the same point.)

Back in 2015, I asked Ted Dawson, the Abramson Professor of Neurodegenerative Diseases at Johns Hopkins School of Medicine, what he thought of Prevagen’s claims.
“It is hard to evaluate Prevagen as to the best of my knowledge there is no peer-reviewed publication on its use in memory and cognition,” said Dawson. “The study cited on the company’s web site is a small short study, raising concerns about the validity of the claims.”
Finally, a word to those who are still tempted to try Prevagen: it isn't cheap. Their website charges $75 for a bottle of 60 pills, each containing 10 mg of apoaequorin, or $90 for 30 pills of the "Professional formula," each containing 40 mg. (Note that there's no evidence that taking a higher dose will work any better.)
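Out of curiosity, it's easy to compare the two formulations on cost per milligram (using the listed prices above; this is not an endorsement, since there's no evidence any dose works):

```python
# Cost comparison using the listed prices above. Not an endorsement.
formulas = {
    "Regular": {"price": 75, "pills": 60, "mg_per_pill": 10},
    "Professional": {"price": 90, "pills": 30, "mg_per_pill": 40},
}
for name, f in formulas.items():
    cost_per_mg = f["price"] / (f["pills"] * f["mg_per_pill"])
    print(f"{name}: ${cost_per_mg:.3f} per mg")
# Regular: $0.125/mg; Professional: $0.075/mg
```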

The FTC sued Quincy Bioscience in 2017 for deceptive advertising, arguing that claims that Prevagen boosts memory are false, and that claims it can get into the brain are also false. Just a few months ago, a judge ruled that the case can proceed. Meanwhile, though, the advertising and sales of Prevagen continue. The FTC case states that Quincy sold $165 million worth of Prevagen in the U.S. from 2007 to 2015.

So the bottom line is: jellyfish proteins are very cool, but eating them won't improve your memory. If you're interested in brain food, perhaps you should just eat more fish, which might actually work.

(Note: I wrote about Prevagen in 2015, and some elements of this article are based on my earlier one.)

How to live longer: eat less red meat and more nuts. And be optimistic.

Will eating more red meat kill you? It just might–but there's something you can do about it.

Red meat has long been implicated in some very bad health outcomes, especially heart disease and colon cancer. And yet people keep eating it, for the simple reason that it tastes good (though not to everyone, of course).

A recent study out of Harvard and Fudan University, published in the BMJ, put some numbers on the risk of eating red meat. The authors used two very large studies, one of men and one of women, to ask a simple question: does eating red meat make it more likely that a person will die?

The answer is yes: in the study, men and women who ate more red meat–half a serving more per day–had about a 9% greater risk of death (from any cause), over the course of an 8-year followup period. Processed meats were worse: an increase of half a serving per day led to a 17% higher risk of death.

In case you're wondering, "processed" meats are foods like hot dogs, bacon, and sausages. And if half a serving per day sounds like a lot, it's not: the scientists defined a serving as just 85 grams of beef, pork, or lamb, or a single 45-gram hot dog, or 2 slices of bacon. By comparison, a quarter-pound hamburger is 115 grams. So an extra half-serving isn't very much.
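To make that concrete, here's the arithmetic on those serving sizes (as defined above):

```python
# Serving sizes (grams) as defined in the BMJ study.
red_meat_serving = 85   # beef, pork, or lamb
hot_dog = 45            # one hot dog = one serving of processed meat
quarter_pounder = 115   # grams of beef in a quarter-pound burger

half_serving = red_meat_serving / 2
print(f"Half a serving of red meat: {half_serving:.1f} g, "
      f"about {half_serving / quarter_pounder:.0%} of a quarter-pounder")
# ~42 g, roughly a third of one burger; for processed meat,
# half a serving is just half a hot dog, or one slice of bacon, per day.
```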

(But smoked salmon lovers needn't worry: as I wrote back in August, smoked salmon is not processed meat. It's fish, which is far healthier than red meat.)

Can you lower your risk of death by reducing red meat consumption? The study looked at this question too, and the answer was, again, yes: if you replace one serving of red meat per day with whole grains, vegetables, or nuts, your risk of dying goes down, as much as 19%.

Even better is to replace one serving per day of processed meat (bacon, sausages, etc) with nuts:
"A decrease in processed meat and a simultaneous increase in whole grains, vegetables, or other protein sources was even more strongly associated with lower total mortality, with the largest reductions in risk seen with increases in nuts."
That led to a 26% reduction in the risk of death over eight years. The authors found similar results when they looked at benefits over different time spans.

The conclusion is pretty clear: replace some of the red meat in your diet with vegetables, whole grains, or nuts, and you'll probably live longer.

There's another thing you can do to avoid dying of a heart attack: be optimistic. In a completely independent study published just last week in JAMA, scientists conducted a meta-analysis of 15 earlier studies, asking whether optimism is associated with better heart health. They found that over a 14-year period, optimistic people had a 35% lower (relative) risk of cardiovascular problems and 14% lower (relative) risk of dying than pessimistic people.

There are many caveats about this study–first, it's a meta-analysis, meaning that it combines the data from many other studies. That can lead to biases, but the authors acknowledged this problem and seem to have been pretty careful to avoid it. Second, how do you measure optimism? Turns out there's a questionnaire for that, dating back 25 years, and it appears to be reliable and reproducible. Most of the studies used the same method for measuring optimism, and the benefits were quite consistent across all the studies. And it's possible that sicker people are more pessimistic, so the cause-and-effect could go either way here.
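For the curious, the heart of a meta-analysis is straightforward: pool each study's effect estimate, weighting more precise studies more heavily. A minimal fixed-effect sketch (with made-up numbers, not the JAMA data):

```python
# Fixed-effect (inverse-variance) pooling. Made-up numbers, not the JAMA data.
import math

# Each entry: (log relative risk, standard error) for one hypothetical study.
studies = [(-0.35, 0.15), (-0.20, 0.10), (-0.45, 0.25), (-0.30, 0.12)]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled RR: {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```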

So there you have it: cut down on red meat, eat nuts instead, and stay positive. You'll live longer.

Is this drug combo a true fountain of youth?

Is rejuvenation of the thymus a key to restoring youth? Maybe it is.

A very surprising result appeared last week in a journal called Aging Cell. A team of scientists published the first results of a study that showed, in a small group of older men, that some signs of aging could be reversed with a 3-drug combination.

Not just slowed down. Reversed.

If this holds up, it could literally be life-changing for millions of people. I was initially very skeptical, having read countless claims of anti-aging treatments over the years, virtually all of which turned out to be wrong. Anti-aging treatments are a huge commercial market, full of misleading promises and vague claims. Youth-restoring skin treatments (which don't work) are a particular favorite of cosmetics companies.

But this new study is different. The scientists decided to explore whether recombinant human growth hormone (rhGH) could help to restore the thymus gland. Your thymus is in the middle of your chest, and it is part of your immune system, helping to produce the T cells that fight off infections. As we age, the thymus shrinks and starts to get "clogged with fat," as a news story in Nature put it. Hints that rhGH could help restore the thymus go back decades, but it had never before been tested in humans.

The scientists leading the study added two more drugs, DHEA and metformin, because rhGH does carry some increased risk of diabetes. Both of these drugs help to prevent diabetes, and both might also have anti-aging benefits, although neither of them is known to affect the thymus.

Amazingly, in 7 out of 9 men in the study (it was a very small study), the thymus showed clear signs of aging reversal, with new thymus tissue replacing fat. The side effects of rhGH are very mild, and none of the men in this study had any significant problems from it or from the other two drugs.

Equally remarkable was another, unanticipated, sign of anti-aging. The study measured "epigenetic age" in all the subjects by four different methods. "Epigenetic age" refers to markers at the cellular level that change as we age, and as the study explains:
"Although epigenetic age does not measure all features of aging and is not synonymous with aging itself, it is the most accurate measure of biological age and age‐related disease risk available today."
After 9 months of treatment, the epigenetic age of the men in this study was 2.5 years younger. The treatment didn't just slow aging–it reversed it. The effects persisted in a followup 6 months later: one and a half years after the study began, the men's epigenetic age was 1.5 years younger than at the beginning. This is truly remarkable.
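In case you're wondering what an "epigenetic clock" actually is: at bottom, it's a regression model that predicts age from DNA methylation levels at a panel of genomic sites. Here's a heavily simplified sketch on synthetic data (real clocks, such as Horvath's, use hundreds of CpG sites fit with elastic-net regression):

```python
# Toy "epigenetic clock" on synthetic data. Real clocks use hundreds
# of CpG sites measured in real tissue; this is only an illustration.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(1)
n_people, n_sites = 200, 50

ages = rng.uniform(20, 80, n_people)
drift = rng.normal(0, 0.01, n_sites)  # per-year methylation change per site
methylation = 0.5 + np.outer(ages, drift) + rng.normal(0, 0.05, (n_people, n_sites))

clock = ElasticNet(alpha=0.01).fit(methylation, ages)
predicted_age = clock.predict(methylation)
print(f"Mean prediction error: {np.abs(predicted_age - ages).mean():.1f} years")
```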

Any study has limitations, so I should mention a couple here. First, the study was very small, just 9 men, but the effects were strong and significant. Second, the lead scientist of the study, Gregory Fahy, is the co-founder of a company called Intervene Immune that plans to market anti-aging treatments. The authors also include scientists from Stanford, UCLA, and the University of British Columbia.

A few years ago I wrote about another drug combination, dasatinib and quercetin, which showed great promise in reversing aging, but only in mice. We're still waiting to hear more about that treatment, although a test in humans showed some promise earlier this year.

The new 3-drug combination is the most promising I've seen yet. The possible benefits are enormous: as the study points out, they include lower risks for at least 8 types of cancer, heart disease, and stroke. Unlike many other anti-aging treatments, this one has genuine plausibility, and the effects on the thymus can be measured almost immediately. Let's hope this one works out; we'll all be better off if it does.

College football season is starting up. Why do universities support a sport that harms their student athletes?

For those of us in academia, September means a new school year, and all of the excitement and energy that students bring as they return to campus. Strolling around, you can feel it in the air.

September is also the beginning of the college football season (in the U.S.). For many students, alumni, and other fans, watching the game each week is one more fall activity they look forward to.

But now, thanks to a rapidly growing body of new research, we know that football can severely harm and even kill its players. Not right away, but years later, through a brain disease called CTE, or chronic traumatic encephalopathy. This is a frightening disorder that gradually destroys brain cells, causing memory loss, confusion, impaired judgment, and progressive dementia. Many former players die very young, in their 40s or 50s, after first suffering for years.

CTE is caused by repeated blows to the head, events that are common in football. It has grown worse in recent decades as the players have gotten bigger and stronger. Improvements in helmet technology haven't helped, and they might even have made CTE worse, because the helmets allowed players (by their own admission) to use their heads as battering rams.

Two years ago now, a large medical study of football players' brains showed that an appallingly high percentage of those players had CTE. In that study, Boston University scientists led by Jesse Mez and Ann McKee found CTE in the brains of 110 out of 111 former NFL players (99%), and 48 out of 53 college players (91%).

As the BU scientists themselves pointed out, the former players and their families may have suspected something was wrong, and that may have motivated them to participate in the study. Thus the extremely high proportion of deceased players showing CTE in this study is certainly an overestimate. But as I wrote at the time:
"is it okay to ask young men to play football if the risk of permanent brain damage is only 50%? What if it's just 10 or 20%? Is that okay? Is football that important?"
Clearly, the answer should be no. University presidents are constantly, even obsessively, worrying about the safety of their students. Campuses have many programs in place to protect students from crime, from sexual harassment, from emotional distress, and more. And yet every fall, they willingly–no, enthusiastically–subject the 100 or so students on their football teams to a serious risk of lifelong, life-threatening brain damage. This simply should end.

For an especially poignant story, watch this short video about Greg Ploetz, who played on the 1969 national championship football team at the University of Texas, and who died in 2015 after years of worsening dementia:
As his daughter says in the video,
"If [today's football players] knew what he went through, and what we went through as a family, there's no way that people would decide to keep playing." 
Perhaps universities could take a cue from former Baltimore Ravens player John Urschel, widely considered the smartest player in the NFL, who was pursuing a Ph.D. in mathematics at MIT while simultaneously playing pro football. Two years ago, Urschel quit, because he was worried that brain damage would destroy his ability to think clearly. And just one week ago, Indianapolis Colts' star quarterback Andrew Luck retired early because football had "wrecked his body and stolen his joy."

Brain damage may be happening to much younger players too. A study from the University of Washington last year found that 5% of youth football players aged 5-14 had experienced concussions each season. Three years ago, a mother sued the Pop Warner youth football organization after her son committed suicide at age 25. An autopsy showed that he had CTE, and the mother argued that his brain damage was caused by his years playing youth football. The Pop Warner organization settled the suit for close to $2 million, but other lawsuits have been filed since.

As I and others have written, football and its promise of big-money television contracts has corrupted our universities. While universities build ever-bigger football stadiums and pay coaches exorbitant salaries, they force the players to play for free. Now we know that players face a much more direct threat: long-term brain damage.

Let me ask university presidents this question as bluntly as I can: how much brain damage is acceptable for your football players? If your answer is "none," then it's time to get football out of our universities.

$545 million wasn't enough for chiropractors. Now they're lobbying Congress for much more.

Medicare currently wastes more than $545 million a year on chiropractors, as I revealed in an article last year. Wasteful as this is, it's not enough for chiropractors, who have successfully lobbied to have two Congressmen propose a new bill, HR3654, that would require Medicare to pay chiropractors for the full range of services that real doctors offer.

The American Chiropractic Association is practically rubbing its (metaphorical) hands together with glee. As they proudly point out, this endorsement of quackery is bipartisan: the bill is sponsored by two New York Congressmen, Democrat Brian Higgins and Republican Tom Reed.

The idea of having chiropractors function as regular physicians is very troubling. Chiropractors do not receive proper medical training: they get their Doctor of Chiropractic (D.C.) degrees from one of a very small number of special chiropractic schools, which do not provide the full medical training that real medical schools do. Their curriculum also includes a heavy dose of pseudoscience, especially the training around subluxations.

For a detailed discussion of why chiropractors are not competent to be family physicians, I recommend this article by an experienced physician, Dr. Harriet Hall, titled "Chiropractors as family doctors? No way!" Dr. Hall goes into considerable detail explaining why many of the medical practices of chiropractors are non-standard, not evidence-based, and possibly harmful. Or see this lengthy takedown of chiropractic subluxations by Sam Homola, a former chiropractor.

Many chiropractors are also anti-vaccine, unfortunately, as documented just two weeks ago in this article by attorney Jann Bellamy. Among other things, Bellamy points out that a major chiropractic conference this fall will feature a keynote talk by anti-vaccine activist Robert Kennedy Jr. (about whom I've written before).

Even more alarming, as I've explained before, is that chiropractic neck manipulation has been shown to carry a small but real risk of stroke, because it can create a tear in your vertebral arteries. For example, this report from 2016 documented a case of cerebral hemorrhage apparently caused by chiropractic manipulation. The patient in that case was a 75-year-old woman, which puts her squarely in the class of patients eligible for Medicare.

And to those chiropractors who've read this far: I'm sorry that you were hoodwinked into spending 3-4 years in a chiropractic school, paying nearly $200,000 in tuition and fees, with the promise that you'd be a legitimate medical professional. You were scammed, and I'm sorry about that. And I understand that most (perhaps all) chiropractors want to help their patients. The problem is, the training offered by chiropractic colleges is far short of a proper medical degree.

If the chiropractors' lobbying association gets its way, this $545 million (annually) in wasted Medicare dollars will soon become a far higher amount–to the detriment of patients. The bill would allow chiropractors to bill Medicare for pretty much any service that a bona fide physician offers.

It's also worth noting that in 2018, Medicare's Inspector General issued a report titled "Medicare needs better controls to prevent waste, fraud, and abuse related to chiropractic services," which revealed that almost half of Medicare spending on chiropractic care from 2010-2015, between $257 million and $304 million per year, was likely wasted or fraudulent. One wouldn't think this is a time to expand Medicare's coverage of chiropractic.

Congress, don't be fooled by arguments that this proposed new law will lower medical costs, or give patients what they need: it won't. Instead, it will dramatically increase the amount of funds wasted on ineffective treatments. The U.S. does need a better health care system, but this bill would be a big step in the wrong direction.

Hey, NY Times: Keep your hands off my smoked salmon

For lovers of smoked salmon, the New York Times featured an alarming headline last week: "Do Lox and Other Smoked Fish Increase Cancer Risk?" The article reported that the American Institute for Cancer Research, a respected nonprofit organization, considers smoked fish (including lox) to be in the same category as "processed meat." The Times answers its own question with "it might."

The Times is wrong. If anything, smoked salmon is good for you. Let me explain.

Where does the concern come from? In 2015, a major report from the International Agency for Research on Cancer (IARC) concluded that red meat and processed meat probably cause cancer, especially colon cancer. To be precise, they wrote that
"there is sufficient evidence in human beings for the carcinogenicity of the consumption of processed meat."
And what, you might ask, is processed meat? According to the IARC:
"Processed meat refers to meat that has been transformed through salting, curing, fermentation, smoking, or other processes to enhance flavour or improve preservation."
Although the 2015 IARC report didn't mention smoked fish, the NY Times reporter, Sophie Egan, points out that smoked salmon (lox) is also transformed through salting and smoking (or curing, if you consider gravlox). To support this concern, Egan quotes Alice Bender, a dietitian (with a master's degree but not a doctorate) from the American Institute for Cancer Research. According to Bender, who was not involved in the IARC report,
"Even though it’s possible that processed fish and even chicken and turkey could be better alternatives [to processed meats], for now we have to look at all of it as processed meat."
No, we don't.

I read the IARC report, and it doesn't mention smoked fish. It states that processed meats usually "contain pork or beef, but might also contain other red meats, poultry, offal (eg, liver), or meat byproducts such as blood." And an earlier report gave these examples of processed meats: ham, bacon, sausages, blood sausages, meat cuts, liver paté, salami, bologna, tinned meat, luncheon meat, and corned beef. Nothing about fish.

However, I wanted to be certain, so I dug down into the original research. The IARC report is based on a whole raft of earlier studies, which they combined and summarized, and it turns out that some of those studies did indeed look at smoked fish.

In particular, this IARC study from 2007–one of the studies that the 2015 IARC report relied upon–looked at both meat and fish and how they affected the risk of colon cancer. The 2007 study found that consumption of fish reduced the risk of cancer. And most important for today's discussion, they stated explicitly that
"Fish included fresh, canned, salted, and smoked fish."
There you have it. Consumption of fish, including smoked fish, reduces the risk of colon cancer. (A minor caveat: smoked salmon does have a high level of salt, which can be a concern for people with high blood pressure.)

So my response to the NY Times: keep your hands off my bagels and lox. Really, you should know better.

The US will try treating opioid addiction with fake medicine

If you can't afford to offer real medical care, why not offer fake medicine? The U.S. Medicare system is about to give this strategy a try, for treating back pain.

Last week, Medicare announced that it wants to start paying for studies of acupuncture as a treatment for low back pain, as reported by the Washington Post and Stat. The government's reason, according to Secretary of Health and Human Services Alex Azar, was that we need this option to help solve opioid addiction:
“Defeating our country’s epidemic of opioid addiction requires identifying all possible ways to treat the very real problem of chronic pain, and this proposal would provide patients with new options while expanding our scientific understanding of alternative approaches to pain.”
If you break down HHS Secretary Azar's statement, it's mostly correct. Yes, defeating opioid addiction requires exploring all possible ways to treat chronic pain. And yes, this program will provide "new" options, even though the option in question is nonsense.

But no, studying acupuncture will not expand our scientific understanding of "alternative approaches" to pain. Why not? Because thousands of studies have already been done, and the verdict was in, long ago, that acupuncture is nothing more than an elaborate placebo.

The problem is, acupuncture proponents never give up. Every time a study shows that acupuncture fails (and this has happened, repeatedly), they claim it wasn't done properly or make another excuse. I've even seen proponents argue that studies in which acupuncture failed were in fact successes, because acupuncture and placebo treatments both outperformed the "no treatment" option.

(Aside: we use placebo treatments because we've known for decades that any treatment, even a sugar pill, may show a benefit as compared to no treatment at all. Acupuncture research has created placebos by using fake needles that don't actually pierce the skin, or by placing needles in random places rather than the so-called acupuncture points. Scientifically speaking, if a treatment doesn't outperform a placebo, then the treatment is a failure.)
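The statistical point is easy to demonstrate. In this toy simulation (illustrative numbers only), patients enroll in a trial when their chronic pain flares up; even a completely inert "treatment" then looks effective, because pain naturally drifts back toward its usual level:

```python
# Regression to the mean: why "pain improved after treatment" proves nothing.
# Illustrative numbers only.
import numpy as np

rng = np.random.default_rng(7)
n_patients = 1000

usual_pain = rng.normal(5, 1, n_patients)  # each patient's baseline pain (0-10 scale)
at_enrollment = usual_pain + rng.normal(1.5, 1, n_patients)  # enrolled during a flare-up
weeks_later = usual_pain + rng.normal(0, 1, n_patients)      # inert treatment; pain reverts

print(f"Mean pain at enrollment:     {at_enrollment.mean():.1f}")
print(f"Mean pain after 'treatment': {weeks_later.mean():.1f}")
# Pain drops ~1.5 points even though the treatment did nothing at all --
# which is exactly why trials need a placebo arm for comparison.
```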

To make matters worse, the new HHS program will fund "pragmatic" clinical trials rather than the usual, gold-standard randomized trials (RCTs). Without going into details, let's just say that pragmatic trials are much less well-controlled than RCTs, allowing more room for mistakes and misinterpretation. This is a bad idea even when the intervention being studied is legitimate. It's an even worse idea here, where trials have shown, over and over, that acupuncture doesn't work.

Secretary Azar might be confused because the acupuncture industry has managed to get hundreds of studies published, many of them positive–but most of them are poorly designed, and who has time to read all that bad science? (The rare well-designed studies always show that acupuncture doesn't work.) Acupuncturists have even created pseudoscientific journals devoted entirely to acupuncture, as I wrote about in 2017. Some of these journals are published by respected scientific publishers, but they are still little more than fake journals.

Not surprisingly, with entire journals trying to fill each issue with acupuncture articles, last week's Medicare announcement noted that
"the agency [CMS] recognizes that the evidence base for acupuncture has grown in recent years". 
No, it hasn't. What has grown is the number of articles. Adding more garbage to a pile doesn't make it smell better.

For those who aren't familiar with the claims of acupuncture, let's do a very quick summary: acupuncturists stick needles in a person's body at specific points in order to manipulate a mystical life force that they call "qi" (pronounced "chee"). This idea is "a pre-scientific superstition" that has no basis in medicine, physiology, or biology, and has never had any good scientific evidence to support it. Acupuncturists don't even agree on where the acupuncture points are, which by itself should make rigorous scientific study impossible. It's not at all surprising that acupuncture doesn't work; indeed, if it did work, modern medicine would have to seriously examine what mechanism could possibly explain it.

But wait, argue proponents, what about all the wise traditional doctors in China who developed acupuncture over thousands of years? Well, it turns out that acupuncture wasn't popular in China until the mid-20th century, when Chairman Mao pulled a fast one on his population because he couldn't supply enough real medical care. Mao didn't use acupuncture himself and apparently didn't believe in it. I highly recommend this exposé of Mao's scam, by Alan Levinovitz in Slate.

So rather than spend millions of dollars on yet another study of acupuncture for pain, I have a better suggestion for HHS: invest the funds in basic biomedical research, which has had a flat budget for more than a decade now. As long as it goes through proper peer review, almost any research will be far better than wasting the money on acupuncture.

Now, I'm not naive enough to think that Medicare will take my advice, but I can tell them right now what their new "pragmatic" trials will reveal. Acupuncturists will happily take the money, treat people suffering from back pain, and report that some of them experienced reductions in pain. Some patients will inevitably report improvement, because back pain comes and goes, and it's hard to know why it went away.

Then the acupuncturists will say, "Look, it works! Now please cover acupuncture for all Medicare patients." Then we'll spend more tax dollars on pseudoscience, and patients will be in just as much pain as ever. If Medicare falls for this (and I fear they will), then Chairman Mao will have fooled the U.S. government, just as he fooled many of his own people half a century ago.

The loneliest word, and the extinction crisis

We're in the midst of an extinction crisis. Just two months ago, an international committee known as IPBES released a report, compiled over 3 years by 145 experts from 50 countries, that said 1,000,000 plant and animal species are threatened with extinction, many within the next few decades.

[Photo: Martha, the very last passenger pigeon, shown when she was still alive.]
Before getting to that report, I want to introduce a word that I only just learned: endling. An endling (a word coined in 1996) is the last surviving member of a species. One example was Martha, the very last passenger pigeon, who died in the Cincinnati Zoo in 1914. Passenger pigeons numbered in the billions in the 19th century, but humans wiped them out.

In 2012 we lost another endling, Lonesome George–the very last Pinta Island tortoise from the Galapagos Islands–who died at around age 100.

If you want to see a particularly poignant example of an endling, watch this rare and heartbreaking video of Benjamin, the very last Tasmanian tiger (or thylacine), pacing around his cramped enclosure in Hobart, Tasmania. This film from 1933 is the last known motion picture of a living thylacine. Benjamin died in 1936.
[Photo: Two Tasmanian tigers in the Washington, D.C. zoo, around 1904. Photo credit: Baker; E.J. Keller, from the Smithsonian Institution archives.]

We have records of other endlings too: the last Caspian tiger was killed in the 1950s in Uzbekistan, and the last great auks were killed for specimen collectors in 1844.

Unfortunately, we're likely to see more and more endlings in the years to come. The causes of extinction are varied, and many of them are related to human activities. The IPBES ranked the culprits, in descending order, as:

  1. changes in land and sea use,
  2. direct exploitation of organisms,
  3. climate change,
  4. pollution, and
  5. invasive alien species.

In response to the IPBES report, the House of Representatives held a hearing in May to discuss the findings. Republicans on the committee took the opportunity to display a new form of denialism: extinction denialism. As reported in The Guardian, Representatives Tom McClintock and Rob Bishop used their time to attack the reputations of the report's authors, rather than addressing the very serious consequences of large-scale extinction. They called two climate-change deniers as witnesses, who also used their time to attack the authors.

This is a classic strategy used by deniers: attack the messenger, rather than dealing with the substance of the report. Let's consider just a few of the report's main findings (see much more here):

  • Across the planet, 75% of the land and about 66% of the marine environments have been significantly altered by human actions.
  • Up to $577 billion in annual global crop output is at risk from the loss of pollinators (bees and other insects).
  • In 2015, 33% of marine fish stocks were being harvested at unsustainable levels; 60% were maximally sustainably fished.
  • Plastic pollution has increased tenfold since 1980, 300-400 million tons of heavy metals, solvents, toxic sludge and other wastes from industrial facilities are dumped annually into the world’s waters, and fertilizers entering coastal ecosystems have produced more than 400 ocean ‘dead zones’, covering a combined area greater than that of the United Kingdom.

The report is a call to action. It explains that transformative change is needed to protect and restore nature, and collective action is needed to overcome special interests such as the fossil fuel industry, which donates heavily to politicians. The Congressional hearing was a vivid demonstration of how effective the anti-environmental lobbyists have been.

Endling is the saddest word in any language. If we humans continue to treat nature as we've done in the past, we're going to see many more videos like the one of Benjamin, the last Tasmanian tiger. Let's hope we can do better.

Does the length of your fingers predict sexual orientation?

Imagine my surprise last week when I saw an article in Science that claimed "finger lengths can predict personality and health."* Huh?

The author, science writer Mitch Leslie, gives us the rather startling number that over the past 20 years, more than 1400 papers have been published linking finger lengths to personality, sexual orientation, cardiovascular disease, cancer, and more.

What is this magical finger length ratio? Simple: it's the ratio between the lengths of your index (2nd) and ring (4th) fingers, also called the 2D:4D ratio. Take a look: is your index finger longer than your ring finger?

It turns out that most people have slightly longer ring fingers than index fingers, and in men the difference is a bit larger. If the ring finger is longer, then the 2D:4D ratio is less than one. One recent study reported that this ratio averaged 0.947 in men and 0.965 in women. Another study found average values of 0.984 and 0.994 for men and women. Not only is this a tiny difference, but in every study the distributions of 2D:4D ratios in men and women overlapped substantially, meaning the number alone doesn't tell you very much.
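
To put a number on that overlap, here's a back-of-the-envelope calculation (my own sketch; the means come from the first study above, but the standard deviation of 0.03 is my assumption, chosen to be in the ballpark of values this literature typically reports). Assuming normal distributions, even the best possible guess of a person's sex from the ratio alone is right only about 62% of the time:

```python
import math

def norm_cdf(x, mu, sigma):
    # Cumulative distribution function of a normal distribution
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu_men, mu_women = 0.947, 0.965   # means reported in the study cited above
sd = 0.03                         # assumed; not reported in the text above

# Best possible rule using the ratio alone: guess "male" below the midpoint
midpoint = (mu_men + mu_women) / 2
accuracy = 0.5 * norm_cdf(midpoint, mu_men, sd) \
         + 0.5 * (1 - norm_cdf(midpoint, mu_women, sd))
print(f"best-case accuracy from 2D:4D alone: {accuracy:.1%}")  # about 62%
```

A coin flip gets 50%; the much-studied ratio adds barely a dozen percentage points, and that's before you account for measurement error.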

Nonetheless, some researchers have taken this tiny physiological difference and run with it. Nearly 20 years ago, Berkeley psychologist Marc Breedlove (now at Michigan State) published a study in Nature in which he and his colleagues measured finger-length ratios in 720 adults in San Francisco. Based on these data, they concluded that finger-length ratios show
"evidence that homosexual women are exposed to more prenatal androgen than heterosexual women are; also, men with more than one older brother, who are more likely than first-born males to be homosexual in adulthood, are exposed to more prenatal androgen than eldest sons."
Whoa! They are not only claiming that the 2D:4D ratio is predictive of homosexuality, but also that exposure to prenatal androgen is the root cause of both finger lengths and sexual orientation. (Confusing correlation with causation, perhaps?) Not surprisingly, this claim is not widely accepted.

There are many, many more claims out there. In 2010, the BBC boldly reported that 
“The length of a man's fingers can provide clues to his risk of prostate cancer, according to new research.”
based on this study in the British Journal of Cancer. That study found that men whose index fingers were longer than their ring fingers had a reduced risk of cancer. (I don't believe it for a second, but if it makes you feel better, go ahead.) And a 2016 report found that both men and women with a low 2D:4D ratio (longer ring fingers) had better athletic abilities. 

The Science article goes on to explain, though, that "the results often can't be replicated." Most of these studies are small, the measurement techniques vary widely, and efforts to reproduce them (when others have tried, which isn't often) usually fail. It didn't take me long to find a few, such as this study from 2012, which was the second failure to replicate a result claiming a link between sex hormone exposure and the 2D:4D ratio.
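
That pattern–a flood of small positive studies that later fail to replicate–is exactly what you'd expect even if the ratio predicts nothing at all. Here's a quick simulation (mine, with arbitrary numbers): 1,000 small studies each test a trait that is truly unrelated to 2D:4D, and about 5% of them still come out "statistically significant" by chance alone.

```python
import random
import statistics

random.seed(1)

def pearson_r(xs, ys):
    # Pearson correlation coefficient
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

N_STUDIES, N_SUBJECTS = 1000, 30
R_CRIT = 0.361   # |r| needed for p < 0.05 (two-tailed) when n = 30

false_positives = 0
for _ in range(N_STUDIES):
    ratio = [random.gauss(0.95, 0.03) for _ in range(N_SUBJECTS)]  # 2D:4D values
    trait = [random.gauss(0.0, 1.0) for _ in range(N_SUBJECTS)]    # unrelated trait
    if abs(pearson_r(ratio, trait)) > R_CRIT:
        false_positives += 1

print(f"{false_positives} of {N_STUDIES} null studies reached 'significance'")
# Roughly 50 "positive" findings, with nothing real behind any of them.
```

If mostly the "significant" studies get written up, a literature of 1,400 papers can accumulate without the evidence growing at all.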
[Photo: The author's left hand.]

After reading the whole Science article, one comes away with the impression that finger-ratio science is almost certainly bogus. The presentation, though, gives far more space to the claims of those who believe in it, and one gets the strong impression that the journalist (Mitch Leslie) is on their side. One hint comes in his last sentence where, after saying that the two sides are "talking past one another," he writes "more than 20 papers using the digit ratio have already come out last year."

And since the last sentence is often a giveaway for what the writer really thinks, let me conclude by saying that both my ring fingers are longer than my index fingers.

[*The print version of Science contains precisely this claim in the subheading to the article: "Some researchers say a simple ratio of finger lengths can predict personality and health." Interestingly, the online version of the same article does not have this headline. Instead, it reads "Scientists try to debunk idea that finger length can reveal personality and health." It appears as if the online editors were more skeptical than the print editors.]