Saying something over and over again doesn’t make it true. Or does it? A psychological phenomenon known as the “illusory truth effect” suggests that people tend to believe information more after repeated exposure. The more they hear it, the truer it feels.

One such claim, that “red meat is bad for your health,” has been a steady drumbeat throughout the health and wellness industry for decades. However, a systematic review in Nature Medicine points out several weaknesses in the research supporting this claim. 

The review was performed by a team of researchers from the Institute for Health Metrics and Evaluation (IHME) at the University of Washington School of Medicine. In it, they evaluated the strength of evidence for 180 pairs of behaviors and negative health outcomes, including smoking and lung cancer, diets low in vegetables and type 2 diabetes, and red meat consumption and several potential health harms.

In particular, the review examined the relationship between eating unprocessed red meat and breast cancer, colorectal cancer, type 2 diabetes, ischemic heart disease, ischemic stroke, and hemorrhagic stroke. Its findings contradict the red meat warning we’re used to hearing:

“We found weak evidence of association between unprocessed red meat consumption and colorectal cancer, breast cancer, type 2 diabetes and ischemic heart disease. Moreover, we found no evidence of an association between unprocessed red meat and ischemic stroke or hemorrhagic stroke.”

How could that be? As consumers, we are told over and over to limit our red meat intake by reputable institutions like the World Health Organization, the World Cancer Research Fund, and the U.S. Departments of Health and Human Services and Agriculture. Many of us get similar marching orders from our primary care doctors and specialists. It turns out that the studies serving as the foundation of these warnings have significant limitations and flaws.

The studies are mostly observational 

While observational studies can provide helpful insights, they are limited in their ability to prove that one thing causes another. They rest on assumptions and are vulnerable to confounding variables that can distort the supposed cause-and-effect relationship.

When trying to establish a causal link between two variables, the gold standard is a randomized controlled trial, in which participants are randomly assigned to a control or experimental group (and aren’t told which one they are in). For ethical reasons, that is not possible here.
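To see why randomization matters, here is a minimal simulation sketch (the numbers are invented for illustration and are not from the review) in which eating more red meat has no causal effect at all, yet a hidden confounder, overall health-consciousness, drives both intake and disease risk:

```python
import random

random.seed(0)

def simulate(randomized: bool, n: int = 100_000) -> float:
    """Return the difference in disease rates (higher-intake group minus
    lower-intake group) in a toy population where red meat has NO causal effect."""
    high_cases = low_cases = high_n = low_n = 0
    for _ in range(n):
        # Hidden confounder: less health-conscious people both eat more red meat
        # and carry more baseline risk (smoking, inactivity, and so on).
        health_conscious = random.random() < 0.5
        if randomized:
            eats_more_meat = random.random() < 0.5  # assigned by coin flip
        else:
            eats_more_meat = random.random() < (0.3 if health_conscious else 0.7)
        base_risk = 0.05 if health_conscious else 0.15  # risk ignores meat entirely
        sick = random.random() < base_risk
        if eats_more_meat:
            high_n += 1
            high_cases += sick
        else:
            low_n += 1
            low_cases += sick
    return high_cases / high_n - low_cases / low_n

print("Observational risk difference:", round(simulate(randomized=False), 3))  # roughly +0.04
print("Randomized risk difference:   ", round(simulate(randomized=True), 3))   # roughly  0.00
```

In the observational arm, the higher-intake group looks about four percentage points sicker purely because of the confounder; randomizing the exposure erases the gap.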

The studies often involved self-reporting

Many of these studies relied on participants to report on their own eating habits, which is rarely done accurately (or honestly). Do you remember what you ate for dinner last week? When you ordered steak and eggs at the diner on Sunday, did you really bring the food scale with you? Or did you eyeball it? People cut corners, and our memory is not as good as we think it is. 

And who among us is prepared to disclose our secret “midnight ice cream” habit to a research team? 

Meta-analyses combine results without accounting for key differences between studies

Meta-analyses pool results from several individual studies. However, study designs and protocols can vary, which leads to differences in results between studies, known as “between-study heterogeneity.” Pooling results anyway, without accounting for these differences, leaves meta-analyses vulnerable to bias and error. Importantly, one of the key findings of this review was significant between-study heterogeneity for all six of the red meat and health outcome pairs evaluated.
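As a rough illustration of what that heterogeneity looks like in numbers, here is a short sketch (the effect sizes are made up and are not figures from the review) that computes Cochran’s Q and the I² statistic, two standard measures of between-study disagreement:

```python
# Hypothetical log relative risks and standard errors for five studies
# of the same exposure-outcome pair (illustrative values only).
studies = [
    (0.05, 0.04),
    (0.30, 0.06),
    (-0.10, 0.05),
    (0.22, 0.07),
    (0.01, 0.03),
]

weights = [1 / se ** 2 for _, se in studies]  # inverse-variance weights
pooled = sum(w * lrr for (lrr, _), w in zip(studies, weights)) / sum(weights)

# Cochran's Q: how far each study sits from the pooled estimate, scaled by its precision.
q = sum(w * (lrr - pooled) ** 2 for (lrr, _), w in zip(studies, weights))
df = len(studies) - 1
i_squared = max(0.0, (q - df) / q) * 100  # share of variation beyond chance, in percent

print(f"Pooled log relative risk: {pooled:.3f}")
print(f"Cochran's Q: {q:.1f} (df = {df})")
print(f"I-squared: {i_squared:.0f}%")
```

With these numbers, I² comes out near 90 percent, meaning most of the spread across studies reflects genuine differences between them rather than chance, which is exactly the situation in which a single pooled estimate can mislead.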

In the case of red meat, the researchers also found that several of the meta-analyses combined results across studies under an assumption of log-linearity, which means the added risk from eating a fixed extra amount of red meat is assumed to be the same at every level of consumption. For example, increasing your red meat consumption from zero to 4 ounces per week would be assumed to have the same effect as increasing it from 8 to 12 ounces per week.

However, as the researchers point out in their findings, current evidence does not support the assumption of log-linearity. For many risk factors, the magnitude of response gets smaller at higher doses rather than remaining constant. So while we don’t yet have much information on the shape of the risk curves for red meat and various health outcomes, this evidence casts doubt on a log-linear relationship.

At the very least, the researchers see no evidence to claim with any confidence that a log-linear function is the best way to model the health effects of eating red meat.
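To make the log-linearity assumption concrete, here is a small sketch (the slope parameters are invented for illustration and do not come from the review) comparing a log-linear dose-response curve with one that flattens at higher intakes:

```python
import math

def rr_log_linear(ounces_per_week: float, beta: float = 0.02) -> float:
    """Log-linear model: each extra ounce multiplies risk by the same factor,
    so log(relative risk) rises in a straight line with intake."""
    return math.exp(beta * ounces_per_week)

def rr_flattening(ounces_per_week: float, beta: float = 0.06) -> float:
    """One illustrative alternative: the response shrinks at higher intakes
    (here, log(relative risk) grows with the square root of intake)."""
    return math.exp(beta * math.sqrt(ounces_per_week))

for label, rr in (("log-linear", rr_log_linear), ("flattening", rr_flattening)):
    low_step = rr(4) / rr(0)    # going from 0 to 4 oz per week
    high_step = rr(12) / rr(8)  # going from 8 to 12 oz per week
    print(f"{label:>10}: 0 -> 4 oz multiplies risk by {low_step:.3f}, "
          f"8 -> 12 oz multiplies it by {high_step:.3f}")
```

Under the log-linear model, the jump from 0 to 4 ounces per week carries exactly the same risk multiplier as the jump from 8 to 12; under the flattening curve, the second jump matters noticeably less, which is the pattern the researchers say current evidence is more consistent with.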

More research is needed to prove causation

The researchers concluded that more research is needed to clarify the link between several risk factors and health outcomes, especially with red meat:

“While there is some evidence that eating unprocessed red meat is associated with increased risk of disease incidence and mortality, it is weak and insufficient to make stronger or more conclusive recommendations. More rigorous, well-powered research is needed to better understand and quantify the relationship between consumption of unprocessed red meat and chronic disease.”

The IHME scientists created their own five-star rating system to evaluate the strength of the evidence linking behaviors to health outcomes. Perhaps it can serve as a guide for future researchers looking to fill knowledge gaps with stronger studies. After all, repeating a scientific claim over and over doesn’t make it true; repeating the experiment does.