This is another special guest post from our favorite study-dismantler, Denise Minger. Read all of her previous Mark’s Daily Apple articles here, here, here and here, pay her website a visit, and stay tuned for her upcoming book “Death by Food Pyramid” due out later this year.
We're already 74 days into the new year, which can only mean one thing: it's high time for our latest episode of Science Says Meat Will Kill You, complete with a brand new study and commercial-free viral media coverage! Have a seat and tune in (or at least set your DVR for later viewing).
If you haven't had at least one family member, coworker, or soon-to-be-unfriended Facebook acquaintance send you this study as a reminder that you're killing yourself, you're either really lucky or your inbox is broken. Thanks to an observational study called Red Meat Consumption and Mortality freshly pressed in the Archives of Internal Medicine, a slew of bold headlines exploded across every conceivable media outlet this week:
- "All red meat is bad for you, new study says"
- "Red meat is blamed for one in 10 early deaths"
- "Scientists warn 'red meat can be lethal'"
Media sensationalism aside, the study does seem to spell trouble for proud omnivores. Unlike some similar publications we've seen on meat and mortality, this one says that red meat doesn't just make you die of heart disease and cancer; it makes you die of everything. Following over 120,000 women and men from the Nurses' Health Study and the Health Professionals Follow-up Study for 28 and 22 years respectively, researchers found that a single daily serving of unprocessed red meat was associated with a 13% increased risk of death from all causes, while a single serving of processed red meat – the equivalent of one hot dog – was associated with a 20% increased risk.
And in case that's not enough to chew on, there's more: the researchers waved their statistical wands and declared you could outrun death for a few more years by swapping red meat for so-called "healthier foods" like nuts, chicken, or whole grains. In fact, the researchers suggest that up to one in ten of the deaths that struck their study participants could've been prevented if everyone had kept their red meat intake under half a serving per day!
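If you're wondering how a modest-sounding 13–20% risk bump turns into a "one in ten deaths" headline, the usual route is a population attributable fraction calculation. Here's a back-of-the-envelope sketch – the exposure prevalences below are invented for illustration, not the study's actual numbers:

```python
# Hedged sketch: how modest relative risks become a "one in ten
# deaths preventable" claim via the population attributable
# fraction (PAF). The prevalences are made up for illustration;
# they are NOT taken from the study.

def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1))."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Assumed for the sketch: 50% of participants eat >= 1 daily serving
# of unprocessed red meat (RR 1.13), 30% eat processed meat (RR 1.20).
paf_unprocessed = population_attributable_fraction(0.50, 1.13)
paf_processed = population_attributable_fraction(0.30, 1.20)

print(f"Unprocessed: {paf_unprocessed:.1%} of deaths attributable")
print(f"Processed:   {paf_processed:.1%} of deaths attributable")
```

With those invented prevalences, the two fractions land at roughly 6% apiece – close to one in ten combined. Note that the whole calculation inherits the correlational hazard ratios at face value; it only means anything if the association is actually causal.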
But if you've been hanging around the nutrition world for very long, you've probably realized by now that health according to the media and health according to reality are two very different things – and even scientific studies can be misrepresented by the researchers who conduct them. Is our latest "killer meat" scare a convincing reason to ditch red meat? Is it time to put a trigger lock on your lethal grass-fed beef when the young'uns are around? Or is there more to this story than meats the eye? (Sorry, I had to.)
Observations vs. Experiments
Before we even dig into what this study found, let's address an important caveat that the media – and even the researchers, unless they were terribly misquoted – seem to be confused about. What we've got here is a garden-variety observational study, not an actual experiment where people change something specific they're doing and thus make it possible to determine cause and effect. Observations are only the first step of the scientific method – a good place to start, but never the place to end. These studies don't exist to generate health advice, but to spark hypotheses that can be tested and replicated in a controlled setting so we can figure out what's really going on. Trying to find "proof" in an observational study is like trying to make a penguin lactate. It just ain't happening… ever.
Nonetheless, the media blurbs – and even quotes from the scientists themselves – suggest this study has a major case of mistaken identity. The lead researcher Frank Hu claimed the study "provides clear evidence that regular consumption of red meat, especially processed meat, contributes substantially to premature death," despite the fact that the study is innately incapable of providing such evidence. It's as if someone pulled a Campbell on us. Only an actual experiment, with controls and manipulated variables, could start confirming causation.
But the study's over-extrapolation isn't really that surprising. A conclusive experiment is what every observational study secretly yearns to be, deep down in its confounder-riddled, non-randomized heart. And like pushy stage mothers, some researchers want their observational studies to be more talented and remarkable than they truly are – leading to the scientific equivalent of a four-year-old wobbling around in stilettos at a beauty pageant. Our study at hand is a perfectly decent piece of observational literature, but as soon as its authors (or the media) smear it with lipstick and make it sing Patsy Cline songs on stage, it's all downhill from there.
Food Frequency Questionnaires: A Test of Superhuman Memory and Saint-like Honesty
To kick this analysis off, let's take a look at how the study was actually conducted. As the researchers explain, all of the diet data came from a series of food frequency questionnaires (FFQs) that the study participants filled out once every four years, starting in the 1980s and ending in 2006. (If you're feeling brave, you can read the questionnaire yourself (PDF) and try imagining how terribly the average, non-diet-conscious person might botch their responses.) The lifestyle and medical data came from additional questionnaires administered every two years.
The full text of our study offers some additional details (emphasis mine):
In each FFQ, we asked the participants how often, on average, they consumed each food of a standard portion size. There were 9 possible responses, ranging from "never or less than once per month" to "6 or more times per day." Questionnaire items about unprocessed red meat consumption included "beef, pork, or lamb as main dish" (pork was queried separately beginning in 1990), "hamburger," and "beef, pork, or lamb as a sandwich or mixed dish." … Processed red meat included "bacon" (2 slices, 13 g), "hot dogs" (one, 45 g), and "sausage, salami, bologna, and other processed red meats" (1 piece, 28 g).
Notice that one of the foods listed under "unprocessed red meat" – and likely a major contributor to that category – is hamburger, the stuff fast-food dreams are made of. Although this study tracked whole grain intake, it didn't track refined grain intake, so we know right away we can't totally account for the white-flour buns wrapped around those burgers (or many of the other barely-qualifying-as-food components of a McDonald's meal). And unless these cohorts were chock full of folks who deliberately sought out decent organic meat, it's also worth noting that the unprocessed ground beef they were eating probably contained that delightful ammonia-treated pink slime that's had conventional meat consumers in an uproar lately.
Next, we arrive at this little gem:
The reproducibility and validity of these FFQs have been described in detail elsewhere.
Ding ding, Important Thing alert! As anyone who's spent much time on earth should know, expecting people to be honest about what they eat is like expecting one of those "Lose 10 pounds of belly fat" banners to take you somewhere other than popup-ad purgatory: the idealism is sweet and all, but reality has other plans.
And so it is with food frequency questionnaires. Ever since these questionnaires were first birthed unto the world, scientists have lamented their most glaring flaw: people tend to report what they think they should be eating instead of what actually goes into their mouth. And that's on top of the fact that most folks can barely remember what they ate yesterday, much less what they've eaten over the past month or even the past year.
As a result, researchers compare the results of food frequency questionnaires with more accurate "diet records" – where folks meticulously weigh and record everything they eat for a straight week or two – to see how the data matches up. If we follow that last quote to the links it references, we end up at one of the validation reports for the food frequency questionnaire used with the Health Professionals Follow-up Study. Here's where it gets interesting:
Foods underestimated by the FFQs compared with the diet records (ie, the gold standard) included processed meats, eggs, butter, high-fat dairy products, mayonnaise and creamy salad dressings, refined grains, and sweets and desserts, whereas most of the vegetable and fruit groups, nuts, high-energy and low-energy drinks, and condiments were overestimated by the FFQs.
This shouldn't come as a shocker if we consider human psychology. Unless we literally live in a cave, most of us are constantly inundated with messages about how high-fat dairy, meat, sweets, desserts, and anything delicious and creamy is going to either make us fat or give us a heart attack – while it's more like hallowed be thy name for fruits and veggies. Is it any wonder that folks tend to under-report their intake of "bad" foods and over-report their intake of the good ones? Who wants to admit – in the terrifying permanency of a food questionnaire – that yes, they do bury their salad in half a cup of Hidden Valley Ranch, and they do choose white bread because 12-Grain Oroweat tastes like lightly sweetened wood chippings, and sometimes they even go a full three days where their only vegetable is ketchup? If food frequency questionnaires were hooked up to a polygraph, we might see some much different data (and some mysteriously disappearing respondents).
Another reference in our study du jour takes us to a validation report for the Nurses' Health Study questionnaire. And here we find the same trend:
Mean daily amounts of each food calculated by the questionnaire and by the dietary record were also compared; the observed differences suggested that responses to the questionnaire tended to over-represent socially desirable foods.
Of course, if everyone over-reported or under-reported their food intake with the same magnitude of inaccuracy, we could still find some reliable associations between food questionnaires and health outcomes. But it turns out that how much someone fudges their food reporting – especially for specific menu items – varies wildly based on their personal characteristics. Using an Aussie-modified version of the Nurses' Health Study questionnaire, a study from Australia measured how accurately people reported their food intake based on their gender, age, medical status, BMI, occupation, school-leaving age, and use of dietary supplements. Like with the other validation studies, it compared the results of the food frequency survey with the Almighty Weighed Food Record.
The surprising results? Folks with a "diagnosed medical condition" – including high cholesterol, high triglycerides, diabetes, high blood pressure, stroke, cancer, and heart disease – were much more likely to mis-report their meat consumption than folks without a diagnosed medical condition, generally overestimating their true intake on food frequency questionnaires compared to the weighed food record. Why this occurred is one of life's great mysteries, but it might have something to do with the fact that people who develop diet- and lifestyle-related diseases pay less conscious attention to what they eat. (In this study, women were also more likely to inaccurately report their intake for a wide variety of foods – a phenomenon that's been examined in greater depth by other researchers.)
So what does this mean for studies based on food frequency questionnaires, like the one currently hijacking the news outlets? Unfortunately for lovers of scientific accuracy, it means that meat consumption and modern diseases might be statistically more likely to show up hand-in-hand by mere fluke. If sick folks have a tendency – for whatever reason – to say they're eating more meat than they really are, that'll have profound effects on any diet-disease associations that turn up in observational studies, where correlations hinge so heavily on the accuracy of the data. And if the results of that Australian study are applicable not only in the Land Down Under but also in the Land Up Over, it could mean that meat is pretty much doomed to look guilty by association with disease whenever food frequency questionnaires are involved. Woe is meat!
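To see why differential misreporting is such a big deal, here's a toy simulation – mine, not anything from these studies, with all the numbers invented. True meat intake has zero relationship with disease, but diagnosed participants over-report their intake by about one serving per week on the questionnaire. The reported data ends up showing a meat-disease link out of thin air:

```python
# Toy simulation (invented numbers, not from any of the studies
# discussed): true meat intake is independent of disease, but
# diagnosed participants over-report it on the questionnaire --
# manufacturing a spurious meat-disease correlation in FFQ data.
import random

random.seed(42)
n = 100_000

diseased = [random.random() < 0.2 for _ in range(n)]    # 20% diagnosed
true_meat = [random.gauss(3.0, 1.0) for _ in range(n)]  # servings/week, same distribution for everyone

# Diagnosed folks over-report by ~1 serving/week, plus generic noise.
reported = [m + (1.0 if d else 0.0) + random.gauss(0, 0.5)
            for m, d in zip(true_meat, diseased)]

def mean(xs):
    return sum(xs) / len(xs)

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

d = [1.0 if x else 0.0 for x in diseased]
print(f"true intake vs disease:     r = {corr(true_meat, d):+.3f}")
print(f"reported intake vs disease: r = {corr(reported, d):+.3f}")
```

The true-intake correlation hovers near zero, while the reported-intake correlation comes out clearly positive – disease "predicts" reported meat consumption even though, by construction, there's no real link. That's the kind of artifact an observational study built on questionnaires can't rule out.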