How to live longer: eat less red meat and more nuts. And be optimistic.

Will eating more red meat kill you? It just might–but there's something you can do about it.

Red meat has long been implicated in some very bad health outcomes, especially heart disease and colon cancer. And yet people keep eating it, for the simple reason that it tastes good (though not to everyone, of course).

A recent study out of Harvard and Fudan University, published in the BMJ, put some numbers on the risk of eating red meat. The authors used two very large studies, one of men and one of women, to ask a simple question: does eating red meat make it more likely that a person will die?

The answer is yes: in the study, men and women who ate more red meat–half a serving more per day–had about a 9% greater risk of death (from any cause), over the course of an 8-year followup period. Processed meats were worse: an increase of half a serving per day led to a 17% higher risk of death.

In case you're wondering, "processed" meats are foods like hot dogs, bacon, and sausages. And if half a serving per day sounds like a lot, it's not: the scientists defined a serving as just 85 grams of beef, pork, or lamb, or a single 45-gram hot dog, or 2 slices of bacon. By comparison, a quarter-pound hamburger is about 115 grams. So an extra half-serving isn't very much.

(But smoked salmon lovers needn't worry: as I wrote back in August, smoked salmon is not processed meat. It's fish, which is far healthier than red meat.)

Can you lower your risk of death by reducing red meat consumption? The study looked at this question too, and the answer was, again, yes: if you replace one serving of red meat per day with whole grains, vegetables, or nuts, your risk of dying goes down by as much as 19%.

Even better is to replace one serving per day of processed meat (bacon, sausages, etc) with nuts:
"A decrease in processed meat and a simultaneous increase in whole grains, vegetables, or other protein sources was even more strongly associated with lower total mortality, with the largest reductions in risk seen with increases in nuts."
That led to a 26% reduction in the risk of death over eight years. The authors found similar results when they looked at benefits over different time spans.

The conclusion is pretty clear: replace some of the red meat in your diet with vegetables, whole grains, or nuts, and you'll probably live longer.

There's another thing you can do to avoid dying of a heart attack: be optimistic. In a completely independent study published just last week in JAMA, scientists conducted a meta-analysis of 15 earlier studies, asking whether optimism is associated with better heart health. They found that over a 14-year period, optimistic people had a 35% lower (relative) risk of cardiovascular problems and 14% lower (relative) risk of dying than pessimistic people.

There are many caveats about this study–first, it's a meta-analysis, meaning that it combines the data from many other studies. That can lead to biases, but the authors acknowledged this problem and seem to have been pretty careful to avoid it. Second, how do you measure optimism? It turns out there's a questionnaire for that, dating back 25 years, and it appears to be reliable and reproducible. Most of the studies used the same method for measuring optimism, and the benefits were quite consistent across all the studies. Third, it's possible that sicker people are simply more pessimistic, so the cause and effect could run in either direction here.

So there you have it: cut down on red meat, eat nuts instead, and stay positive. You'll live longer.

Is this drug combo a true fountain of youth?

Is rejuvenation of the thymus a key to restoring youth? Maybe it is.

A very surprising result appeared last week in a journal called Aging Cell. A team of scientists published the first results of a study that showed, in a small group of older men, that some signs of aging could be reversed with a 3-drug combination.

Not just slowed down. Reversed.

If this holds up, it could literally be life-changing for millions of people. I was initially very skeptical, having read countless claims of anti-aging treatments over the years, virtually all of which turned out to be wrong. Anti-aging treatments are a huge commercial market, full of misleading promises and vague claims. Youth-restoring skin treatments (which don't work) are a particular favorite of cosmetics companies.

But this new study is different. The scientists decided to explore whether recombinant human growth hormone (rhGH) could help to restore the thymus gland. Your thymus is in the middle of your chest, and it is part of your immune system, helping to produce the T cells that fight off infections. As we age, the thymus shrinks and starts to get "clogged with fat," as a news story in Nature put it. Hints that rhGH could help restore the thymus go back decades, but the idea had never before been tested in humans.

The scientists leading the study added two more drugs, DHEA and metformin, because rhGH does carry some increased risk of diabetes. Both of these drugs help to prevent diabetes, and both might also have anti-aging benefits, although neither of them is known to affect the thymus.

Amazingly, in 7 out of 9 men in the study (it was a very small study), the thymus showed clear signs of aging reversal, with new thymus tissue replacing fat. The side effects of rhGH are very mild, and none of the men in this study had any significant problems from it or from the other two drugs.

Equally remarkable was another, unanticipated, sign of anti-aging. The study measured "epigenetic age" in all the subjects by four different methods. "Epigenetic age" refers to markers at the cellular level that change as we age, and as the study explains:
"Although epigenetic age does not measure all features of aging and is not synonymous with aging itself, it is the most accurate measure of biological age and age‐related disease risk available today."
After a year of treatment, the epigenetic age of the men in this study was 2.5 years younger than when they started. The treatment didn't just slow aging–it reversed it. The effects persisted in a follow-up 6 months later: one and a half years after the study began, the men's epigenetic age was still 1.5 years younger than at the beginning. This is truly remarkable.

Any study has limitations, so I should mention a couple here. First, the study was very small, just 9 men, but the effects were strong and significant. Second, the lead scientist of the study, Gregory Fahy, is the co-founder of a company called Intervene Immune that plans to market anti-aging treatments. On the other hand, the authors also include scientists from Stanford, UCLA, and the University of British Columbia.

A few years ago I wrote about another drug combination, dasatinib and quercetin, which showed great promise in reversing aging, but only in mice. We're still waiting to hear more about that treatment, although a test in humans showed some promise earlier this year.

The new 3-drug combination is the most promising I've seen yet. The possible benefits are enormous: as the study points out, they include lower risks for at least 8 types of cancer, heart disease, and stroke. Unlike many other anti-aging treatments, this one has genuine plausibility, and the effects on the thymus can be measured almost immediately. Let's hope this one works out; we'll all be better off if it does.

College football season is starting up. Why do universities support a sport that harms their student athletes?

For those of us in academia, September means a new school year, and all of the excitement and energy that students bring as they return to campus. Strolling around, you can feel it in the air.

September is also the beginning of the college football season (in the U.S.). For many students, alumni, and other fans, watching the game each week is one more fall activity they look forward to.

But now, thanks to a rapidly growing body of new research, we know that football can severely harm and even kill its players. Not right away, but years later, through a brain disease called CTE, or chronic traumatic encephalopathy. This is a frightening disorder that gradually destroys brain cells, causing memory loss, confusion, impaired judgment, and progressive dementia. Many former players die very young, in their 40s or 50s, after first suffering for years.

CTE is caused by repeated blows to the head, which are common in football. The problem has grown worse in recent decades as players have gotten bigger and stronger. Improvements in helmet technology haven't helped, and they may even have made CTE worse, because the helmets allowed players (by their own admission) to use their heads as battering rams.

Two years ago, a large medical study of football players' brains showed that an appallingly high percentage of them had CTE. In that study, Boston University scientists led by Jesse Mez and Ann McKee found CTE in the brains of 110 out of 111 former NFL players (99%), and 48 out of 53 former college players (91%).

As the BU scientists themselves pointed out, the former players and their families may have suspected something was wrong, and that may have motivated them to participate in the study. Thus the extremely high proportion of deceased players with CTE in this study is almost certainly an overestimate of the true rate. But as I wrote at the time:
"is it okay to ask young men to play football if the risk of permanent brain damage is only 50%? What if it's just 10 or 20%? Is that okay? Is football that important?"
Clearly, the answer should be no. University presidents are constantly, even obsessively, worrying about the safety of their students. Campuses have many programs in place to protect students from crime, from sexual harassment, from emotional distress, and more. And yet every fall, they willingly–no, enthusiastically–subject the 100 or so students on their football teams to a serious risk of lifelong, life-threatening brain damage. This simply has to end.

For an especially poignant story, watch this short video about Greg Ploetz, who played on the 1969 national championship football team at the University of Texas, and who died in 2015 after years of worsening dementia.
As his daughter says in the video,
"If [today's football players] knew what he went through, and what we went through as a family, there's no way that people would decide to keep playing." 
Perhaps universities could take a cue from former Baltimore Ravens player John Urschel, widely considered the smartest player in the NFL, who was pursuing a Ph.D. in mathematics at MIT while simultaneously playing pro football. Two years ago, Urschel quit, because he was worried that brain damage would destroy his ability to think clearly. And just one week ago, Indianapolis Colts' star quarterback Andrew Luck retired early because football had "wrecked his body and stolen his joy."

Brain damage may be happening to much younger players too. A study from the University of Washington last year found that 5% of youth football players aged 5-14 experienced a concussion each season. Three years ago, a mother sued the Pop Warner youth football organization after her son committed suicide at age 25. An autopsy showed that he had CTE, and the mother argued that his brain damage was caused by his years playing youth football. The Pop Warner organization settled the suit for close to $2 million, but other lawsuits have been filed since.

As I and others have written, football, with its promise of big-money television contracts, has corrupted our universities. While universities build ever-bigger football stadiums and pay coaches exorbitant salaries, the players themselves are forced to play for free. Now we know that players face a much more direct threat: long-term brain damage.

Let me ask university presidents this question as bluntly as I can: how much brain damage is acceptable for your football players? If your answer is "none," then it's time to get football out of our universities.