Let the 2010s be the end of the post-truth era

The 2010s are over, and the double-20s are about to begin. One phrase that I'd like to never hear again is "post-truth era," an idea that has gained prominence during the past decade, especially the last few years.

One dominant theme of post-truthery is that every person has a right to his/her own interpretation of the facts, and that these alternative interpretations–or "alternative facts," as Trump adviser Kellyanne Conway famously called them–deserve to be taken seriously. They don't.

And yet, like most of the pseudoscience that I've been writing about for years now, we have to continue calling out nonsense for what it is, because some of it is harmful, and even deadly. Politics is a never-ending font of post-truthery (or truthiness, as Stephen Colbert defined it in 2005), but here I'm concerned about science. Today I'll highlight three anti-truth campaigns that have caused great harm over the past decades, in the hope that they will soon fade away.

Among serious scientists, truth is not being questioned. Indeed, the essence of science is the search for objective truth. Sometimes, though, a scientific discovery threatens to undermine the power or profits of an influential industry, and that's when industry resorts to denialism (which is essentially the same thing as post-truthiness).

Perhaps the most well-documented example of organized science denialism dates back to the 1950s, when accumulating evidence made it clear that smoking causes cancer. The tobacco industry didn't want to admit this, even though their own internal research supported it, because it meant that their main product was killing their customers, which in turn was terrible news for their business. In response,
"the tobacco companies helped manufacture the smoking controversy by funding scientific research that was intended to obfuscate and prolong the debate about smoking and health." (Cummings et al. 2007).
Eventually, after decades of lawsuits and literally millions of smoking-related deaths, the industry was forced to admit the truth and pay out billions of dollars in settlements in the U.S. Nonetheless, many people still smoke, and even as recently as 2016, the largest tobacco company in the U.S., Philip Morris, was still trying to deny the science about cigarettes.

My second example is more recent, and potentially even more harmful to the human species. You might have already guessed it: I'm talking about climate change denialism. For several decades, the evidence has been building that the planet is getting warmer. A series of reports from the Intergovernmental Panel on Climate Change (IPCC) warned, in increasingly confident terms, that humans were the primary cause of recent warming, mostly due to our use of fossil fuels and the carbon they emit into the atmosphere.

The current IPCC report states unequivocally that humans have already caused 1°C of warming, and that the warming will increase rapidly over the next several decades.

Scientifically, these facts are not in dispute. The world is already experiencing more severe storms, unprecedented floods and droughts, and die-offs due to warming oceans (such as the massive and tragic die-off of coral in the Great Barrier Reef).

However, the fossil fuel industry sees global warming as a threat to their profits. Rather than invest in new, cleaner forms of energy, large companies such as Exxon-Mobil and billionaire coal magnates such as the Koch brothers have poured countless millions of dollars into disinformation campaigns to cast doubt on the science of climate change. Prominent among these efforts is The Heartland Institute, a fossil-fuel-funded organization whose main mission is to cast doubt on the science of climate change. (Heartland has also worked to cast doubt on the link between smoking and cancer.)

Climate change denialists learned from the tobacco companies that it was possible to delay government action by decades, simply by casting doubt on the science. Unfortunately, their campaigns have been working. For years, many U.S. politicians denied that the planet was getting warmer. As that argument has become increasingly harder to make with a straight face, they've changed their strategy, and now they might admit that the planet is heating up, but deny that human activities are responsible. They're still wrong.

The goal of this denialism is, simply put, to protect the profits of fossil fuel industries. However, truth doesn't care if you believe it or not. The world is getting hotter. Australia just had its four hottest days in recorded history. The unprecedented heat wave has led to hundreds of fires through much of the country, threatening every major city on the continent. There's no reason to think this won't keep happening.

Climate change denialists are only fooling themselves–after being duped by oil and coal companies.

My third and final example is one that has frustrated me for 15 years now, and it's one that just won't go away: the anti-vaccination movement. This is one of the most exasperating examples of post-truth wrongheadedness, in part because it seems so unnecessary, and because so many children have been harmed as a result.

First, let's be clear about the facts: vaccines are one of the greatest boons to human health in the history of science and medicine. People used to die by the millions from smallpox, polio, and a dozen other infections that are now almost completely preventable by vaccines.

Smallpox was eradicated from the planet in 1980, in one of humankind's greatest public health triumphs. Polio is close behind: a global campaign that began in 1988 has reduced it to just a few countries and fewer than 100 cases worldwide. Both of these successes are due to vaccines.

Many other diseases, including measles, chicken pox, and bacterial meningitis, have been reduced so dramatically by vaccines that physicians in the U.S. and Europe almost never see a case. This graph shows the dramatic effect of the measles vaccine, which was introduced in the U.S. in the early 1960s:
[Graph: Measles cases in the United States by year, 1954–2008. Source: CDC.]
The graph shows the number of reported cases, but the true number was much higher, likely around 4 million per year by one estimate. These are the facts.

So what happened? The modern anti-vax movement, led by a small number of extremely vocal, extremely self-confident individuals, began in 1998 with the publication of a fraudulent study (later retracted when the fraud was uncovered) claiming that vaccines caused autism. The study was led by former physician Andrew Wakefield, who later lost his medical license because of his fraud, which included misleading his co-authors and mistreating patients.

Nonetheless, this study was picked up and amplified by several celebrities with large followings, including former Playboy model and MTV host Jenny McCarthy and political activist Robert F. Kennedy Jr., nephew of former U.S. President John F. Kennedy.

Anti-vax activism is all over the web today, despite many efforts to quash it. Countless claims of "my child got sick after his vaccine jab" are presented as proof that vaccines cause harm, and under post-truthism, we're supposed to take such claims seriously.

These stories can be especially difficult to respond to, because many parents are dealing with children who have genuine health problems–including autism–and the parents often truly believe what they are saying. Thus it won't do to tell them to be quiet and go away: their children really do need medical care. They're simply wrong in believing that vaccines have anything to do with their children's problems. Their beliefs are amplified by Facebook groups and websites whose sole purpose is to echo and reinforce the mistaken claims of anti-vaxxers.

Even though there was never any good evidence that vaccines cause autism, scientists have conducted dozens of studies involving literally millions of children to answer precisely that question, and the evidence is very, very clear: vaccines do not cause autism, nor do they cause any other systematic neurological or behavioral problems. Vaccines do prevent diseases, though, and unvaccinated children can and will get sick. Recent experience has also given us, tragically, many examples of children who died from entirely preventable infections, because their parents didn't vaccinate them. These tragedies didn't have to happen.

Every era is the age of truth. The idea that we're in a "post-truth era," despite being repeated thousands of times in articles, essays, and op-eds over the past decade, is a commentary on people's ability to fool themselves, not on the state of the world. Truth describes the world as it is, and those who choose to deny it might win a short-term argument, but in the long term, they will always lose.

Finally, on a more positive note, I'm somewhat heartened by efforts to educate college students on how to recognize and counter bogus ideas, such as the University of Washington course, Calling Bullshit, created by Carl Bergstrom and Jevin West. (Yes, that really is its title.) As the professors wrote,
"The world is awash in bullshit.... We're sick of it. It's time to do something, and as educators, one constructive thing we know how to do is to teach people." 
If you're interested, they've made the lectures available as free videos on YouTube.

Labelling an era as post-anything has always, to me, suggested a lack of substance, as if the era is defined only by what came before it. The use of "post-" also ignores the fact that the next era will need a name too. Should we expect "post-post-truth" to come next, and what would that look like? Let's hope that the 2020s will give us a return to simple respect for truth. The future of civilization might very well depend on it.

Can the Apple Watch monitor heart health?

Can the Apple Watch accurately detect atrial fibrillation?

When I first heard that scientists were conducting a study to answer this question, I was deeply skeptical. A simple wrist device detecting heart arrhythmias? The idea seemed too simple to be plausible.

But a new study, just published in the New England Journal of Medicine, lays out some pretty compelling evidence that the Apple Watch can do just that.

Atrial fibrillation (along with the closely related atrial flutter) is the most common cardiac arrhythmia in the U.S., affecting some 6 million people. The NEJM article estimates that the lifetime risk of "a-fib" might be as high as 1 in 3. Atrial fibrillation isn't necessarily a problem on its own, but it greatly increases the risk of stroke. Many people have episodes of a-fib without even being aware of them, which is why a simple, convenient way of detecting them could be medically valuable.

The Apple Watch study was truly enormous, with 419,297 subjects monitored for about 4 months. Only a company like Apple, with a popular product like its watch, can even hope to recruit this many people to join a study. The idea was pretty simple: use the Apple Watch to detect an irregular pulse, which might be a sign of atrial fibrillation or flutter.

During the course of the study, 2,161 subjects had at least one report of an irregular pulse, about 0.5% of the total. Each of these subjects was then sent an electrocardiogram (ECG) patch, which they were supposed to wear for several days to determine if they really were having arrhythmias.

Of course, not everyone wore the ECG patch as they were supposed to, but in the end, 450 subjects wore the patch (for an average of about 6 days), filled in all the required questionnaires, and returned it for analysis. Of these, 34% genuinely did have atrial fibrillation, as confirmed by the ECG patch. Some had only brief episodes, but a subgroup had nearly continuous symptoms.

The study also tried to determine the false positive rate of the Apple Watch warnings–that is, how often did it report an irregular pulse when the subject was not experiencing a-fib or atrial flutter? The researchers evaluated all of the reports from watches that were being worn by people who also had an ECG patch. In this analysis, 71% of the irregular pulse reports from the watch corresponded to atrial fibrillation simultaneously measured by the ECG patch. The other 29% weren't normal either: three-fourths of those reports were due to "frequent premature atrial contractions."

So overall, the Watch was surprisingly accurate, with an impressively low false positive rate.
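To make the study's numbers concrete, here is a small arithmetic sketch; the variable names are mine, and the only inputs are the figures reported above.

```python
# Back-of-the-envelope arithmetic using the figures reported in the
# NEJM study, as summarized above. Variable names are illustrative;
# only the numbers come from the article.

subjects = 419_297        # total enrolled participants
notified = 2_161          # got at least one irregular-pulse report
patch_completed = 450     # wore the ECG patch and returned it

notification_rate = notified / subjects
print(f"Notification rate: {notification_rate:.2%}")   # about 0.52%

# 34% of patch wearers had a-fib confirmed by the ECG patch:
confirmed = round(0.34 * patch_completed)
print(f"Confirmed a-fib among patch wearers: {confirmed}")  # 153

# Of the 29% of watch reports NOT matching a-fib on the patch,
# roughly three-fourths were premature atrial contractions:
pac_fraction = 0.29 * 0.75
print(f"Reports explained by premature contractions: {pac_fraction:.0%}")
```

Note that the 0.5% figure is the fraction of subjects who were ever notified, while the 71% figure applies to individual watch reports cross-checked against the patch; the two percentages describe different denominators.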

How does it work? Well, the back of the watch contains several sensors that detect light (photodiodes), along with green and infrared LEDs that emit light. As Apple's website explains,
"By flashing its LED lights hundreds of times per second, Apple Watch can calculate the number of times the heart beats each minute."
This works because your skin is partially transparent: as everyone knows, you can see some of your blood vessels underneath your skin. In addition to measuring blood flow, the watch can also measure electrical signals using electrodes on the back of the watch and on a small dial on the side of the watch, called the Digital Crown. Here's how Apple explains this:
"When you place your finger on the Digital Crown, it creates a closed circuit between your heart and both arms, capturing the electrical impulses across your chest."
In other words, it behaves like an ECG monitor on your wrist. The NEJM study didn't use this feature of the Apple Watch, which suggests that the watch could be even more effective as a heart monitor in the future.

The Apple Watch is far from perfect, though. For one thing, we don't know how many episodes of atrial fibrillation it missed. Only 0.5% of subjects had a report of irregular pulse, but we don't know how many of the remaining 99.5% had an episode that the watch didn't detect. The authors of the study pointed out that they weren't trying to measure the watch's sensitivity, and they emphasized that
"the absence of an irregular pulse notification does not exclude possible arrhythmias."
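The distinction the authors are drawing is between positive predictive value (how often a report is right) and sensitivity (how many true episodes the watch catches). A brief sketch with entirely hypothetical counts, not taken from the study, shows why one can be high while the other is low:

```python
# Hypothetical confusion-matrix counts (NOT from the NEJM study),
# illustrating that a device can have a solid PPV yet still miss
# most true episodes.

def ppv(tp: int, fp: int) -> float:
    """Of all watch reports, what fraction were true a-fib?"""
    return tp / (tp + fp)

def sensitivity(tp: int, fn: int) -> float:
    """Of all true a-fib episodes, what fraction triggered a report?"""
    return tp / (tp + fn)

tp, fp, fn = 70, 30, 140   # invented counts for illustration
print(f"PPV: {ppv(tp, fp):.0%}")                  # 70% -- looks good
print(f"Sensitivity: {sensitivity(tp, fn):.0%}")  # 33% -- most missed
```

The study design could measure something close to PPV, because only notified subjects received a patch; estimating sensitivity would have required continuous ECG monitoring of the entire cohort.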
Another caveat is that the study was funded by Apple, although it was led by scientists at Stanford and included scientists from multiple other highly regarded universities and medical schools. The funding was clearly disclosed in the NEJM article.

On the other hand, using the Apple Watch is far easier than other currently available procedures for monitoring your heart. Patients who have episodes of arrhythmia are typically told to wear a heart-rate monitor for days or weeks at a time. This involves taping electrodes to half a dozen places on your body, all of which are connected by wires to a device (basically a cell phone) that records the readings and sends them to a monitoring company. These monitors are also far more expensive to operate than an Apple Watch.

So despite its imperfections, the Apple Watch might be the vanguard of a new wave of lightweight, less intrusive devices for monitoring our health. The technology is only going to get better.