Last week, a new study in the Archives of Internal Medicine claimed that eating chocolate more frequently is associated with a lower body-mass index, or BMI. The study's authors carefully stated their results as a "link," but then they went on to say that other evidence would "suggest the association could be causal."
Of course the media ate this up (pun intended), with headlines like "The Chocolate Diet?" at The New York Times. The Times story was filled with caveats, pointing out that it was frequency of chocolate consumption (how many times per week you eat it) that was associated with lower body weight, and that people who simply eat more chocolate tend to be, well, fatter. Surprise!
Beatrice Golomb, the lead author of the study, explained to the Times that:
“It’s not the case that eating the largest amount of chocolate is beneficial; it’s that eating it more often was favorable.”

Because the thinner people in the study ate less chocolate overall, they must have eaten much less at each sitting.
Alas, this is just a bad study. The biggest problem is that it was based entirely on questionnaires, in which 1,018 men and women answered questions about how often they exercise and how often they eat chocolate. Here are some key features of these subjects:
- Their average BMI was 28. So they are kind of fat. (A BMI of 25-30 is overweight.)
- They claim to exercise 3.6 times per week, on average.
- They report eating chocolate 2 times per week.
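For context on that average of 28: BMI is simply weight in kilograms divided by height in meters squared, with 25-30 classed as overweight. A quick sketch (my own illustration, not from the study, with made-up example numbers):

```python
# BMI = weight (kg) / height (m)^2
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def category(b: float) -> str:
    """Standard BMI categories."""
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# A hypothetical 85 kg, 1.75 m subject lands right near the study's average:
print(round(bmi(85, 1.75), 1))   # ~27.8
print(category(28))              # the study's average subject is overweight
```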
We all know how this works. Even though the questionnaire is anonymous, we still want to look like we're doing the right thing. So we overstate how often we exercise (3.6 times per week? Ha!) and we understate how often we eat unhealthy foods. The entire result can be explained by a tendency of thinner people to admit eating chocolate more often. Observational studies like this one are notoriously unreliable.
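To see how that kind of reporting bias alone can manufacture the study's result, here's a toy simulation (entirely my own construction, with invented parameters, not anything from the actual data). Everyone in it truly eats chocolate at the same rate; the only difference is that heavier people under-report more:

```python
import random

random.seed(0)

# Toy model: everyone truly eats chocolate ~3 times/week, regardless of BMI.
# The only difference is honesty: heavier people under-report more.
people = []
for _ in range(10_000):
    bmi = random.gauss(28, 4)                      # BMI centered on the study's average
    true_freq = max(0.0, random.gauss(3, 1))       # true times/week, same for everyone
    # Made-up under-reporting curve: admitted fraction shrinks as BMI rises.
    honesty = min(1.0, max(0.2, 1.4 - 0.02 * bmi))
    people.append((bmi, true_freq * honesty))      # store BMI and *reported* frequency

thin = [r for b, r in people if b < 25]
heavy = [r for b, r in people if b >= 30]
print(f"thin group reports  {sum(thin)/len(thin):.2f} times/week")
print(f"heavy group reports {sum(heavy)/len(heavy):.2f} times/week")
# Thinner people "eat chocolate more often" purely as an artifact of reporting.
```

The questionnaire would record exactly the study's finding, a negative chocolate-BMI association, even though true consumption is identical across the board.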
On this note, I have to give kudos to Gary Taubes at Discover magazine, who isn't buying any of this. Taubes explains that:
"Both of these studies were classic examples of what is known technically as observational epidemiology, a field of research I discussed at great length back in 2007 in the New York Times Magazine. I made the argument that this particular pursuit is closer to a pseudoscience than a real science."

Taubes looked at multiple observational studies done at Harvard's School of Public Health, by one of the top epidemiology groups in the country. But when those observational studies "discovered" a causal relationship, it never seemed to pan out, says Taubes:
"I pointed out that every time that these Harvard researchers had claimed that an association observed in their observational trials was a causal relationship—that food or drug X caused disease or health benefit Y—and that this supposed causal relationship had then been tested in experiment, the experiment had failed to confirm the causal interpretation—i.e., the folks from Harvard got it wrong. Not most times, but every time."

A zero percent validation rate: it doesn't get any worse than that. Taubes makes a very good case that the whole "nutritional epidemiology business" is pseudoscience. You just can't go around asking people what they eat and expect to get reliable answers.
So nope, it just isn't so, no matter how much we'd like to believe it. Eating chocolate isn't going to lower anyone's BMI. But the really important question is, which part of the chocolate Easter bunny do you eat first? I go for the ears.