Should we all be on statins? (reprise)

Should you be on statins? New guidelines and an online calculator may allow you to answer this question yourself.

Back in 2011, I asked whether we should all be on statins. At the time, it was clear that statins offered benefits for people who had already suffered heart attacks or other serious cardiovascular problems. But for the rest of us, it wasn't clear at all. A number of studies had been published suggesting that millions more people (in the U.S. alone) might benefit from statin therapy, but most of those studies were funded by the drug companies that make statins. As I wrote at the time, "we need more data from completely unbiased studies."

So has anything changed? Actually, it has. Last year, the U.S. Preventive Services Task Force (USPSTF) reviewed all of the evidence and updated its earlier recommendations from 2008. The evidence now suggests that some people–even those who have never suffered a heart attack–would benefit from statins.

Here's what the current USPSTF recommendations suggest. If you've never had a heart attack and have no history of heart disease, you still might benefit from statins if:

  • you're 40-75 years old,
  • you have one or more "risk factors" for cardiovascular disease (more about this below), AND
  • you have a 10-year risk of cardiovascular disease (CVD) of at least 7.5%, using a "risk calculator" that I'll link to below.

Now let's look at those risk factors for CVD. There are four of these, and any one of them puts you in the category of people who might benefit from statins: diabetes, high blood pressure (hypertension), smoking, or dyslipidemia.

Most people already know their status for the first 3, but "dyslipidemia" needs a bit more explanation. This is simply an unhealthy level of blood cholesterol, defined by USPSTF as either "an LDL-C level greater than 130 mg/dL or a high-density lipoprotein cholesterol (HDL-C) level less than 40 mg/dL." You can ask your doctor about these numbers, or just look at your cholesterol tests yourself, where they should be clearly marked.

For that last item, how do you calculate your 10-year risk of CVD? Most people should ask their doctor, but if you want to see how it's done, the calculator is at the American College of Cardiology site here. It's quite simple, and you can fill it in yourself to see your risk.
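
Putting the pieces together, the decision logic is simple enough to sketch in a few lines of code. This is just a restatement of the USPSTF criteria described above (the function names and the code itself are my own, and the 10-year risk still has to come from the calculator linked above); it's a rough sketch, not medical advice:

```python
# Sketch of the USPSTF criteria described above, for someone with no
# history of heart disease. Names and structure are my own.
CVD_RISK_FACTORS = {"diabetes", "hypertension", "smoking", "dyslipidemia"}

def has_dyslipidemia(ldl_c_mg_dl, hdl_c_mg_dl):
    """USPSTF definition quoted above: LDL-C > 130 mg/dL or HDL-C < 40 mg/dL."""
    return ldl_c_mg_dl > 130 or hdl_c_mg_dl < 40

def might_benefit_from_statins(age, risk_factors, ten_year_cvd_risk):
    """age in years; risk_factors is a set such as {"smoking"};
    ten_year_cvd_risk is the risk-calculator result, as a fraction."""
    return (
        40 <= age <= 75
        and len(set(risk_factors) & CVD_RISK_FACTORS) >= 1
        and ten_year_cvd_risk >= 0.075
    )

# Example: a 55-year-old smoker whose calculator result is 8%
print(might_benefit_from_statins(55, {"smoking"}, 0.08))   # True
```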

A big caveat here, as the USPSTF explains, is that the "risk calculator has been the source of some controversy, as several investigators not involved with its development have found that it overestimates risk when applied to more contemporary US cohorts."

Another problem that I noticed with the risk calculator is that using it for the statin recommendation involves some serious double counting. That's because the risk calculator relies in part on your cholesterol levels and blood pressure, but those same measurements are considered to be separate risk factors for CVD. This puts a lot of weight on cholesterol levels–but on the other hand, statins' biggest effect is to reduce those levels.

The USPSTF is a much more honest broker of statin recommendations than industry-funded drug studies, so we can probably trust these new guidelines. Note that if the risk calculator puts you in the 7.5%-10% range, you will only get a very small benefit from statins–as the USPSTF puts it, "Fewer persons in this population will benefit from the intervention."

Don't rush to go on statins without giving it some serious thought. As Dr. Malcolm Kendrick put it last year (quoted by Dr. Luisa Dillner in The Guardian),
“If I was taking a tablet every day for the rest of my life, I would want to know how long I would have extra to live. If you take statins for five years and you are at higher risk, then you reduce the risk of a heart attack by 36%. But if you rephrase the data, this means on average you will have an extra 4.1 days of life.” 
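
The point Kendrick is drawing out is the difference between relative and absolute risk. Here's a quick back-of-the-envelope illustration; the 10% baseline below is a made-up number chosen only for illustration, not a figure from his data:

```python
# Hypothetical example: assume a 10% chance of a heart attack over the
# treatment period for someone in a higher-risk group (an assumption,
# chosen only to illustrate relative vs. absolute risk).
baseline_risk = 0.10
relative_reduction = 0.36   # the 36% relative reduction cited above

treated_risk = baseline_risk * (1 - relative_reduction)   # 0.064, i.e. 6.4%
absolute_reduction = baseline_risk - treated_risk         # 0.036, i.e. 3.6 points
number_needed_to_treat = 1 / absolute_reduction           # ~28 people treated
                                                          # to prevent one event

print(f"Risk drops from {baseline_risk:.1%} to {treated_risk:.1%}")
print(f"Roughly 1 in {round(number_needed_to_treat)} people treated avoids a heart attack")
```

A "36% reduction" sounds dramatic; "about 1 in 28 people avoids a heart attack, and everyone else takes a pill every day for no benefit" sounds rather different. That is what rephrasing the data means.
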
So no, we shouldn't all be on statins. But until something better comes along (and I hope it will), they are worth considering for anyone who is in a higher-risk group for cardiovascular disease.

Clever food hacks from Cornell Food Lab might all be fake

Have you heard that serving your food on smaller plates will make you eat less? I know I have. I even bought smaller plates for our kitchen when I first heard about that study, which was published in 2011.

And did you know that men eat more when other people are watching? Women, though, behave in exactly the opposite way: they eat about 1/3 less when spectators are present. Perhaps guys should eat alone if they're trying to lose weight.

Or how about this nifty idea: kids will eat more fruits and vegetables at school if the cafeteria labels them with cool-sounding names, like "x-ray vision carrots." Sounds like a great way to get kids to eat healthier foods.

Or this: you'll eat less if you serve food on plates that are different colors from the food. If the plate is the same color, the food blends in and it looks like you've got less on your plate.

And be sure to keep a bowl of fruit on your counter, because people who do that have lower BMIs.

Hang on a minute. All of the tips I just described might be wrong. The studies that support these clever-sounding food hacks all come from Cornell scientist Brian Wansink, whose research has come under withering criticism over the past year.

Wansink is a professor at Cornell University's College of Business, where he runs the Food and Brand Lab. He has become famous for his "kitchen hacks" and healthy-eating tips, which have been featured by numerous media outlets, including the Rachael Ray show, BuzzFeed, USA Today, Mother Jones, and more.

Last week, Stephanie Lee at BuzzFeed wrote a lengthy exposé of Wansink's work, based on published critiques as well as internal emails that BuzzFeed obtained through a FOIA request. She called his work "bogus food science" and pointed out that
"a $22 million federally funded program that pushes healthy-eating strategies in almost 30,000 schools, is partly based on studies that contained flawed — or even missing — data."
Let's look at some of the clever food hacks I described at the top of this article. That study about labeling food with attractive names like "x-ray vision carrots"? Just last week, it was retracted and replaced by JAMA Pediatrics because of multiple serious problems with the data reporting and the statistical analysis.

The replacement supposedly fixes the problems. But wait a second: just a few days after it appeared, scientist Nick Brown went through it and found even more problems, including data that don't match what the (revised) methods describe, as well as duplicated data.

How about the studies that showed people eat more food when others are watching? One of them, which found that men ate more pizza when women were watching, came under scrutiny after Wansink himself wrote a blog post describing his methods. Basically, when the data didn't support his initial hypothesis, he told his student to go back and try another idea, and then another, and another–until something came up positive.

This is a classic example of p-hacking, or HARKing (hypothesizing after results are known), and it's a big no-no. Statistician Andrew Gelman took notice of this, and after looking at four of Wansink's papers, concluded:
"Brian Wansink refuses to let failure be an option. If he has cool data, he keeps going at it until he finds something, then he publishes, publishes, publishes."
Ouch. That is not a compliment.
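
To see why this practice is so corrosive, here's a minimal simulation of the general idea (a generic illustration of p-hacking, not a re-analysis of any of Wansink's data): generate pure noise, then keep testing new "hypotheses" against it until something clears the usual p < 0.05 bar.

```python
import random
from math import sqrt, erf
from statistics import mean, stdev

random.seed(42)

def p_value_two_groups(a, b):
    """Rough two-sided p-value for a difference in means (normal approximation)."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    z = abs(mean(a) - mean(b)) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

# Forty groups of pure noise: no real effect exists anywhere in this "dataset".
data = [[random.gauss(0, 1) for _ in range(30)] for _ in range(40)]

# Keep comparing group 0 against new groups until something looks "significant".
hits = [i for i in range(1, 40) if p_value_two_groups(data[0], data[i]) < 0.05]
print(f"'Significant' findings in pure noise: {len(hits)} of 39 comparisons")
# On average about 1 in 20 comparisons clears p < 0.05 purely by chance; report
# only those, and random noise starts to look like a publishable discovery.
```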

Soon after Gelman's piece, scientists Jordan Anaya, Tim van der Zee, and Nick Brown examined four of Wansink's papers and found 150 inconsistencies, which they published in July, in a paper titled "Statistical Heartburn: An attempt to digest four pizza publications from the Cornell Food and Brand Lab." Anaya subsequently found errors in 6 more of Wansink's papers.

It doesn't stop there. In a new preprint called "Statistical infarction," Anaya, van der Zee and Brown say they've now found problems with 45 papers from Wansink's lab. Their preprint gives all the details.
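
Many of the problems they found are the kind that simple arithmetic can catch. One well-known check in this spirit is the GRIM test, developed by Nick Brown and James Heathers, which asks whether a reported mean is even possible given the sample size when the underlying responses are whole numbers. Here is a minimal sketch of the idea (my own toy version, not the authors' code):

```python
def grim_consistent(reported_mean, n, decimals=2):
    """GRIM check: could a mean reported to `decimals` places arise from
    n integer-valued responses? (A minimal sketch of the idea.)"""
    target = round(reported_mean, decimals)
    approx_sum = round(reported_mean * n)
    # Only integer totals near reported_mean * n could produce this mean.
    return any(
        round(total / n, decimals) == target
        for total in range(approx_sum - 1, approx_sum + 2)
    )

# Eighteen whole-number responses can average 3.44 (sum = 62), but never 3.45:
print(grim_consistent(3.44, 18))   # True
print(grim_consistent(3.45, 18))   # False
```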

New York Magazine's Jesse Singal, who called Wansink's work "really shoddy research," concluded that
"Until Wansink can explain exactly what happened, no one should trust anything that comes out of his lab."
In response to these and other stories, Cornell University issued a statement in April about Wansink's work, saying they had investigated and concluded this was "not scientific misconduct," but that Cornell had "established a process in which Professor Wansink would engage external statistical experts" to review many of the papers that appeared to have flaws.

And there's more. Retraction Watch lists 14 papers of Wansink's that have either been retracted or flagged with other editorial notices of concern. Most scientists spend their entire careers without a single retraction. One retraction can be explained, and maybe two or even three, but 14? That's a huge credibility problem: I wouldn't trust any paper coming out of a lab with a record like that.

But how about those clever-seeming food ideas I listed at the top of this article? They all sound plausible–and they might all be true. The problem is that the science supporting them is deeply flawed, so we just don't know.

Finally, an important note: Brian Wansink is a Professor of Marketing (not science) in Cornell's College of Business. He is not associated with Cornell's outstanding Food Science Department, and I don't think his sloppy methods should reflect upon their work. I can only imagine what the faculty in that department think about all this.

How much brain damage is too much? NFL players head for the exits.

The smartest player in the NFL just quit.

Not because he was unable to play, and certainly not because of his age–he's only 26. No, Baltimore Ravens player John Urschel decided to quit because the risk of permanent, irreversible brain damage is just not worth it.

Urschel is a very smart guy. He's currently pursuing a Ph.D. in mathematics at MIT, one of the best and most demanding science universities in the world. Until this summer, he was (impressively) balancing his studies with being a full-time NFL player.

But when Dr. Ann McKee and colleagues published a new study showing that 110 out of 111 former NFL players had suffered serious brain damage, Urschel could no longer pretend he wasn't putting his future at grave risk. McKee's study, the largest study yet of chronic traumatic encephalopathy (CTE), showed alarmingly high rates of CTE in college and high school players as well (91% of former college players).

Let's get one point out of the way: everyone involved with the study, including Dr. McKee, knows that it was biased. The scientists examined brains of deceased players that had been donated to the study because family members–or the players themselves, before they died–suspected something was wrong. So perhaps the true risk of brain damage is lower than 99%. Maybe it's only 50%, or 20%. Do young men playing football want to take that risk?
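
To see how this kind of donation bias can inflate an observed rate, here's a small simulation. Every number in it is hypothetical, chosen only to illustrate the effect; none of them is an estimate of the real risk.

```python
import random

random.seed(0)

# Hypothetical assumptions, purely for illustration:
TRUE_CTE_RATE = 0.20        # pretend 20% of former players actually develop CTE
DONATE_IF_AFFECTED = 0.30   # families of symptomatic players donate brains more often
DONATE_IF_HEALTHY = 0.02    # families of unaffected players rarely donate

players = 100_000
donated = donated_with_cte = 0
for _ in range(players):
    has_cte = random.random() < TRUE_CTE_RATE
    p_donate = DONATE_IF_AFFECTED if has_cte else DONATE_IF_HEALTHY
    if random.random() < p_donate:
        donated += 1
        donated_with_cte += has_cte

print(f"True CTE rate in this toy population: {TRUE_CTE_RATE:.0%}")
print(f"CTE rate among donated brains:        {donated_with_cte / donated:.0%}")
# With these made-up numbers, roughly 4 out of 5 donated brains show CTE,
# even though only 1 in 5 players in the toy population is affected.
```

The study can't tell us where the true rate lies, which is exactly why the question above is the one worth asking.
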

John Urschel isn't the first player to quit because of the growing realization that football may cause irreversible brain damage. In 2015, San Francisco 49ers player Chris Borland retired at the age of 24, and in 2016 Kansas City Chiefs player Husain Abdullah retired at 30, both over concerns about concussions and brain damage.

The NFL has been denying or downplaying the risk for years. A few years ago, after the suicide of former player Junior Seau, the league announced a $30 million partnership with the NIH to study the effects of football on the brain. As results started coming in, showing that the risk was far more serious than most people knew, the NFL backed out of the deal with $16 million still unspent.

Meanwhile, the chorus of warnings has been growing steadily louder from the medical community. Last year, a former team doctor and a former football player and coach wrote in JAMA that
"unless there is a way to reduce the number of TBIs [traumatic brain injuries] caused by the sport, football will remain a threat to the brains and health futures of the players, including impaired cognitive function and reasoning, memory loss, emotional depression, and other sequelae that profoundly erode quality of life."
Earlier this year, a study out of the CDC reported that "3 high school or college football players die each year from traumatic brain and spinal cord injuries that occur on the field," most of them as a result of being tackled during games.

Over the years, football players have grown ever larger (the average NFL lineman today weighs over 300 pounds), and the intensity of the violence on the field has grown with them. It's not just in the NFL, either: last year, three high school teams in the state of Washington forfeited their games against a local team out of a legitimate fear that players would be badly injured by the opposing team's 300-plus-pound linemen. Such fears are well founded: the human head simply wasn't built to withstand the repeated blows that players endure.

All players might do themselves a favor by listening to John Urschel. He explained his decision–and his abiding love for the game of football–in a lengthy interview on the Freakonomics podcast a couple of weeks ago. That interview should be required listening for young players, and even more so for parents who might be dreaming that their sons have a future career in football.


Houston, we have a problem. It's called global warming, whether you admit it or not.

Hurricane Harvey poured more rain on Texas and Louisiana last week than this country has ever seen from a single storm. The city of Houston is now suffering from historic flooding, with many calling this a "1000-year flood." Congress is likely to pass a huge bailout bill in the coming days, starting with a $14.5 billion "down payment," suggesting much more is to come.

The storm's eventual cost could climb even higher than that of 2005's Hurricane Katrina, which caused $160 billion in damage, according to NOAA.

Let's not dance around the issue: Hurricane Harvey was a direct consequence of global warming, which in turn is a direct consequence of human activities.

It's ironic that Texas (and Houston in particular) has an economy that is dominated by oil and fossil fuels. Burning these fuels is what got us into this mess.

It's also ironic that Texas Senator Ted Cruz, who is now at the front of the line asking for a federal government rescue package, is a scientifically illiterate climate change denier. As I wrote shortly after he announced his candidacy for President in 2015, Cruz not only denied that global warming was happening, but he then went on to compare himself to Galileo, as if he were taking a brave and bold scientific position. Right.

A few facts: the Gulf of Mexico is 4 degrees warmer than normal this year, and it has been getting worse. Back in March of this year, the Washington Post's Jason Samenow reported that the Gulf was "freakishly warm, which could mean explosive springtime storms." Warm water feeds hurricanes, and Harvey feasted on it, sucking up energy and using it to dump ridiculous amounts of water onto south Texas.

Noted climate scientist Michael Mann, writing in The Guardian, took the slightly more nuanced position that "climate change made Hurricane Harvey more deadly." True enough: if you want to be strictly accurate, we can't prove that warming temperatures are the sole cause of Harvey. Maybe with cooler temperatures, we'd have had a hurricane anyway–but it would have been a far smaller one, and the damage would have been far less severe.

Mann also pointed out that global warming has already caused sea levels to rise over half a foot, which made the flooding in Houston significantly worse than it would have been otherwise.

Now it's time to rebuild, which raises a dilemma. The U.S. can't just abandon Houston, one of our country's largest cities, even if most of its residents deny the reality of global warming (and perhaps they don't). But given that global warming is well under way, with rising sea levels and warming oceans, more catastrophic flooding events like Harvey are highly likely. Should we pay to rebuild the city exactly as it was, basically ignoring the problems of floodwater management as Houston has done until now? Or should we use the government bailout funds to reduce the risk from future flooding?

Actually, it might do even more good to impose a simple requirement before Texas gets any of our bailout funds. Let's require U.S. senators Ted Cruz and John Cornyn, and Texas governor Greg Abbott, to state publicly that global warming is real, that humans are making it worse, and that they will work in the future to mitigate the risks posed by continued climate change. Wouldn't that be something? A simple statement, nothing more, to unlock billions of dollars in aid.

If we just rebuild everything like before, then Houston will continue to have a problem.

(*Note about the title of this article. The original quote was "Houston, we've had a problem," famously uttered by Apollo 13 astronaut Jim Lovell. In the movie Apollo 13, actor Tom Hanks (playing Lovell) instead said, "Houston, we have a problem.")

NIH institute purges climate change references, but not very well

Last week, one of NIH's institutes, the National Institute of Environmental Health Sciences (NIEHS), did something rather mysterious. The institute purged references to climate change on its website by replacing the phrase "climate change" with "climate."

For example, a page formerly titled "Health Impacts of Climate Change" is now titled "Health Impacts of Climate," a title that obscures the main point of the page's content, which is all about climate change. Another page formerly called "Climate Change and Human Health" is now called "Climate and Human Health." Ironically, the web addresses of both these pages still contain the term 'climatechange':

  •   https://www.niehs.nih.gov/research/programs/geh/climatechange/health_impacts/
  •   https://www.niehs.nih.gov/research/programs/geh/climatechange/

which is something of a smoking gun showing the after-the-fact alterations. The attempted purge was first revealed by the nonprofit group EDGI, a group dedicated to addressing "potential threats to federal environmental and energy policy." The Washington Post reported that the changes were made by Christine Flowers, the NIEHS Director of Communications. As quoted in the Post, Flowers explained that:
"It’s a minor change to a title page, but the information we provide remains the same. In fact, it’s been expanded."
True, the contents of these pages seem to be unchanged. But in that case, why change the titles and headings? Clearly something more is afoot. Is NIEHS trying to pretend that climate change isn't real, or that it has no effect on human health? If so, this would undermine the very mission of the institute. Are the NIEHS staff fearful that one of Trump's minions will attack them for describing objective scientific facts? If so, perhaps they should get another job.

NIEHS's attempt to re-write its own history has been woefully ineffective. It's easy to find other NIEHS webpages devoted to climate change, such as:
https://www.niehs.nih.gov/health/topics/agents/climate-change
which has the title "Climate Change" right at the top, and which links to a major report called "The Impacts of Climate Change on Human Health in the United States: A Scientific Assessment." There's also the NIEHS Climate Change and Environmental Exposures Challenge, a competition sponsored by NIEHS to create graphical visualizations showing the effect of climate change on health.

The WashPost story did not mention whether the NIEHS director, Dr. Linda Birnbaum, or her boss, NIH Director Dr. Francis Collins, had plans to restore the original language to the website. I've written to them both to ask, and I'll post an update here if they do.


Reject this incompetent Trump appointee

(Photo caption: Sam Clovis standing next to Trump during the campaign.)
Trump has nominated a non-scientist to be chief scientist at the U.S. Department of Agriculture. This is an outrageous slap in the face to science. It's also a slap in the face to Congress.

As I predicted back in May, Trump has nominated Sam Clovis, a former right-wing radio talk show host and failed Senate candidate from Iowa, to be the chief scientist of the USDA. ProPublica was the first to break this story, and they also pointed out that Clovis was a vocal climate change denialist. Clovis has an undergraduate degree in politics and graduate training in business, but he has no formal training in science at all.

Clovis does have one qualification, though. As ProPublica pointed out, he has been a "fiery pro-Trump advocate on television." Sounds like a good candidate for a chief scientist job to me.

Fortunately (perhaps), the Senate has to approve this appointment. The Senate itself stipulated, in a bill that Congress passed in 2008, that the USDA's chief scientist (the Under Secretary for Research, Education, and Economics) must be appointed from among
"distinguished scientists with specialized training or significant experience in agricultural research, education, and economics."
The law also says, just to make it crystal clear, that the Under Secretary "shall hold the title of Chief Scientist of the Department."

Why is this appointment so wrong? I'll repeat what I wrote back in May:
Overseeing the USDA's research programs requires strong expertise in biological science. A non-scientist has no basis for deciding which research is going well, or what questions need further study, or which questions present the most promising avenues for research. A non-scientist is simply incompetent to choose among them–and I mean this in the literal sense of the word; i.e., not having the knowledge or training to do the job. This does not mean that I think Sam Clovis is incompetent at other things; I don't know him and he might be very capable in other areas. Among other problems, a non-scientist leader of a scientific agency will be incapable of using scientific expertise to set priorities, and instead can make up his own priorities.
If the Senate has any backbone at all–if Republicans are willing to show that they are capable of doing something other than rubber-stamping every action, no matter how damaging, of our self-absorbed, ignorant President–then they will turn down this nomination. Sam Clovis is so obviously unqualified that this should be easy to do.

Actually, if Mr. Clovis cared about the USDA's mission, he would recognize that he's the wrong man for the job and refuse the nomination. Even Dan Glickman, the former Secretary of Agriculture, said "I wouldn't be qualified for that job" (about himself–he's a lawyer) in a recent interview about Clovis' appointment. The current and previous Chief Scientists at the USDA have Ph.D.s and extensive scientific publication records. Mr. Clovis does not. (Note that when I wrote to Mr. Clovis in May to ask about his pending appointment, he declined to respond on the record.)

The Senate's Republicans have confirmed all of Trump's nominees so far, and I fear they will rubber-stamp this one as well. Let's hope that a few of them (and only 3 have to object, assuming all 48 Democrats vote no) realize that appointing a non-scientist to be Chief Scientist of the USDA is a slap in the face not only to science, but to Congress itself, because the appointment scoffs at Congress's own law, passed during the George W. Bush administration.

Trump can find another political appointment for Sam Clovis, as he has for other Trump loyalists. But appointing a former talk radio host, a non-scientist who has never published a single scientific paper, as the Chief Scientist of the USDA is a gross insult to the thousands of hard-working real scientists at the USDA, and to millions more who depend on, and benefit from, the USDA's research programs. Senators: do the right thing and tell Trump to appoint a real scientist to this job.

Will this be the end of college football? The risk of brain damage is startlingly high.

Parents send their kids off to college with high hopes and great expectations. Universities, in turn, have a responsibility to provide an education in an environment that supports and also challenges the students.

Universities are not supposed to encourage activities that may result in permanent brain damage.

And yet, they do. As revealed in a new report by Jesse Mez and colleagues from Boston University, just published in the Journal of the American Medical Association, a shockingly high number of former football players, from both college and professional teams, suffered chronic traumatic encephalopathy (CTE) later in life, likely as a result of their years playing football.

The study authors looked at 202 deceased former football players whose brains had been donated for research, and found that 87% of them had some degree of CTE. The highest rate was among former NFL players, 110 out of 111 of whom had CTE. The rate was nearly as high among college football players, with 91% of them (48 out of 53) suffering from CTE.

Over half of the college players (27) had "severe pathology." The authors noted that in deceased players with severe CTE, the most common cause of death was neurodegenerative disease. As they also explain:
"There is substantial evidence that CTE is a progressive, neurodegenerative disease."
In other words, CTE is a death sentence.

The authors of the study stated their conclusions carefully, noting that the study sample was not random, and that players and their families may have been motivated to participate because they were concerned about a possible link between football and CTE. Nonetheless, even if the risk of CTE is much lower than found in this study, universities should be taking a very hard look at the risks that they are exposing their students to.

Or to put it another way, is it okay to ask young men to play football if the risk of permanent brain damage is only 50%? What if it's just 10 or 20%? Is that okay? Is football that important?

Readers of this column know my answer: no. College is not about football, and if it disappears completely, universities will be just fine. The University of Chicago eliminated its football program in 1939, and brought it back decades later as a much-reduced program, now in NCAA Division III. The university itself has remained a powerhouse, routinely ranked among the top universities in the country academically.

As I've written before, football is corrupting our universities, blinding them to their true mission (education and research) in the pursuit of a profitable entertainment business. University presidents seem helpless to stop or even slow down the enormous machine that is big-time college football. For example, in 2015 the University of Maryland (where I used to be a professor) paid millions of dollars to buy out the football coach, so that he could quit a year early and the university could pay millions more to a new coach. Ironically, Maryland had done exactly the same thing in 2011 to buy out the previous coach, at a time when the entire state had hiring and salary freezes in place. None of these actions benefitted the university or its students.

All the while, universities pretend that they are educating the players. Here's a quote from Bleacher Report's interview with star UCLA quarterback Josh Rosen:
"Look, football and school don't go together. They just don't.... No one in their right mind should have a football player's schedule, and go to school."
This from one of the top college football players in the country. (On this topic, Taylor Branch's 2011 exposé in The Atlantic is particularly worth reading. Or this article by a disillusioned former Michigan fan.)

Universities now face an ethical dilemma far more serious than merely taking advantage of athletes' skills to entertain football fans and pay inflated salaries to coaches. The JAMA study reveals that by running a football program, universities are not just robbing young men of four years that might be better spent getting an education and preparing for life: they might be robbing their students of life itself.