This morning I posted a somewhat humorous take on the whole “red meat increases death” debacle since I knew, from the methodology of the study, that it was likely to be incorrect. Since others had covered why it was wrong, I didn’t see the need to beat it up here. But I would like to share with you Gary Taubes’ take on the red meat study done by the Harvard School of Public Health. He goes deeper into the process of the study in a way that I think is crucially important to understand. Science is no longer science. It’s being used by special interests to prove their desired outcomes. The global warming or climate change debate is a great example of this. When those who believe (or want to believe) that humans are causing climate change say the evidence is incontrovertible, we’ve gone way beyond science into the realm of religion. Science by its very nature cannot be “incontrovertible.” Everything is a theory subject to being disproved at any second. Anything else is dogma.
So I’m going to post a few important points to take away from Gary’s post. Click here to read the full article.
This is an issue about science itself and the quality of research done in nutrition. Those of you who have read Good Calories, Bad Calories (The Diet Delusion in the UK) know that in the epilogue I make a point to say that I never used the word scientist to describe the people doing nutrition and obesity research, except in very rare and specific cases. Simply put, I don’t believe these people do science as it needs to be done; it would not be recognized as science by scientists in any functioning discipline.
Science is ultimately about establishing cause and effect. It’s not about guessing. You come up with a hypothesis — force x causes observation y — and then you do your best to prove that it’s wrong. If you can’t, you tentatively accept the possibility that your hypothesis was right. Peter Medawar, the Nobel Laureate immunologist, described this proving-it’s-wrong step as “the critical or rectifying episode in scientific reasoning.” Here’s Karl Popper saying the same thing: “The method of science is the method of bold conjectures and ingenious and severe attempts to refute them.” The bold conjectures, the hypotheses, making the observations that lead to your conjectures… that’s the easy part. The critical or rectifying episode, which is to say, the ingenious and severe attempts to refute your conjectures, is the hard part. Anyone can make a bold conjecture. (Here’s one: space aliens cause heart disease.) Making the observations and crafting them into a hypothesis is easy. Testing them ingeniously and severely to see if they’re right is the rest of the job — say 99 percent of the job of doing science, of being a scientist.
The problem with observational studies like those run by Willett and his colleagues is that they do none of this. That’s why it’s so frustrating. The hard part of science is left out and they skip straight to the endpoint, insisting that their interpretation of the association is the correct one and we should all change our diets accordingly.
One of the challenges in interpreting a study of this nature is determining what is really being compared. Are you comparing those who ate more steak to those who didn’t? Are you comparing people who are virtually vegetarians to those who are not? Are you comparing those who comply with dietary advice to those who do not? Gary continues:
If you go back and read my New York Times Magazine article on this research, you’ll see that I discussed a whole host of effects, known technically as confounders — they confound the interpretation of the association — that could explain associations between two variables but have nothing to do biologically with the variables themselves. One of these confounders is called the compliance or adherer effect. Here’s what I said about it in the article:

The Bias of Compliance
A still more subtle component of healthy-user bias has to be confronted. This is the compliance or adherer effect. Quite simply, people who comply with their doctors’ orders when given a prescription are different and healthier than people who don’t. This difference may be ultimately unquantifiable. The compliance effect is another plausible explanation for many of the beneficial associations that epidemiologists commonly report, which means this alone is a reason to wonder if much of what we hear about what constitutes a healthful diet and lifestyle is misconceived.
Bottom line: if you tend to be compliant when given instructions, you will obviously benefit from following them. You will also tend to be healthier because you care about your health. Those who are less compliant, or not compliant at all, will fare worse, since they will not be following the recommendations.
It’s this compliance effect that makes these observational studies the equivalent of conventional wisdom-confirmation machines. Our public health authorities were doling out pretty much the same dietary advice in the 1970s and 1980s, when these observational studies were starting up, as they are now. The conventional health-conscious wisdom of the era had it that we should eat less fat and saturated fat, and so less red meat, which also was supposed to cause colon cancer, less processed meat (those damn nitrates) and more fruits and vegetables and whole grains, etc. And so the people who are studied in the cohorts could be divided into two groups: those who complied with this advice — the Girl Scouts, as Avorn put it — and those who didn’t.
Now when we’re looking at the subjects who avoided red meat and processed meat and comparing them to the subjects who ate them in quantity, we can think of it as effectively comparing the Girl Scouts to the non-Girl Scouts, the compliers to the conventional wisdom to the non-compliers. And the compliance effect tells us right there that we should see an association — that the Girl Scouts should appear to be healthier. Significantly healthier. Actually they should be even healthier than Willett et al. are now reporting, which suggests that there’s something else working against them (not eating enough red meat?). In other words, the people who avoided red meat and processed meats were the ones who fundamentally cared about their health and had the energy (and maybe the health) to act on it. And the people who ate a lot of red meat and processed meat in the 1980s and 1990s were the ones who didn’t.
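The compliance confounder is easy to demonstrate with a toy simulation. This is not Taubes’ model and every number below is made up for illustration: we assume red meat has zero causal effect on mortality, but that health-conscious people (the “Girl Scouts”) both avoid red meat and have a lower baseline risk for unrelated reasons. The spurious association appears anyway:

```python
import random

random.seed(0)

def simulate(n=100_000):
    """Toy cohort where red meat has NO causal effect on mortality, yet
    health-conscious people both avoid red meat and carry lower baseline
    risk (the compliance / healthy-user confounder). All numbers invented."""
    deaths = {"low_meat": [0, 0], "high_meat": [0, 0]}  # [deaths, subjects]
    for _ in range(n):
        health_conscious = random.random() < 0.5
        # Health-conscious subjects are far more likely to follow the advice
        # and avoid red meat...
        eats_little_meat = random.random() < (0.8 if health_conscious else 0.2)
        # ...and die less often for reasons unrelated to meat (exercise,
        # not smoking, taking their medications, etc.).
        risk = 0.05 if health_conscious else 0.10  # meat adds nothing here
        died = random.random() < risk
        group = "low_meat" if eats_little_meat else "high_meat"
        deaths[group][0] += died
        deaths[group][1] += 1
    return {g: d / c for g, (d, c) in deaths.items()}

rates = simulate()
print(rates)  # low_meat mortality looks distinctly lower than high_meat
```

The low-meat group shows a mortality rate of roughly 0.06 versus roughly 0.09 in the high-meat group, even though the diet variable does nothing in this model. An epidemiologist looking only at the association would conclude red meat kills.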
What is the solution? Experiment! And we’ve been doing experiments for a long time with many diet variables:
So we do a randomized-controlled trial. Take as many people as we can afford, randomize them into two groups — one that eats a lot of red meat and bacon, one that eats a lot of vegetables and whole grains and pulses — and very little red meat and bacon — and see what happens. These experiments have effectively been done. They’re the trials that compare Atkins-like diets to other more conventional weight loss diets — AHA Step 1 diets, Mediterranean diets, Zone diets, Ornish diets, etc. These conventional weight loss diets tend to restrict meat consumption to different extents because they restrict fat and/or saturated fat consumption and meat has a lot of fat and saturated fat in it. Ornish’s diet is the extreme example. And when these experiments have been done, the meat-rich, bacon-rich Atkins diet almost invariably comes out ahead, not just in weight loss but also in heart disease and diabetes risk factors. I discuss this in detail in chapter 18 of Why We Get Fat, “The Nature of a Healthy Diet.” The Stanford A TO Z Study is a good example of these experiments. Over the course of the experiment — two years in this case — the subjects randomized to the Atkins-like meat- and bacon-heavy diet were healthier. That’s what we want to know.
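The logic of randomization can be sketched with the same kind of toy model (again, every number is made up): if the diet is assigned by coin flip instead of chosen by the subject, the hidden health-consciousness trait is balanced across the two arms and can no longer masquerade as a diet effect:

```python
import random

random.seed(1)

def simulate_rct(n=100_000):
    """Toy randomized trial: diet is assigned by coin flip, so the hidden
    health-consciousness trait ends up evenly split between arms. As in the
    observational toy model, the diet itself has zero causal effect and
    only the hidden trait drives mortality. All numbers invented."""
    deaths = {"meat_arm": [0, 0], "veg_arm": [0, 0]}  # [deaths, subjects]
    for _ in range(n):
        health_conscious = random.random() < 0.5
        arm = "meat_arm" if random.random() < 0.5 else "veg_arm"  # randomized
        risk = 0.05 if health_conscious else 0.10  # diet adds nothing here
        died = random.random() < risk
        deaths[arm][0] += died
        deaths[arm][1] += 1
    return {a: d / c for a, (d, c) in deaths.items()}

rates = simulate_rct()
print(rates)  # the two arms show essentially the same mortality rate
```

Both arms come out near 0.075, correctly revealing that diet does nothing in this model. That is the whole point of randomization: any real difference that survives it is much harder to explain away as a healthy-user artifact.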
I highly recommend that you read the entirety of Gary Taubes’ view on the red meat study. It will help you differentiate real science from pseudo-science when the next inevitable study of this kind comes along.