The World Health Organization (WHO) set the cat among the pigeons last week with a new report linking the eating of red and processed meat to cancer.
It didn’t claim our way of life is killing us, but you could be forgiven for thinking so, given the reactions. Agriculture Minister Barnaby Joyce, for instance, said the WHO would have humans living in caves were we to follow all its recommendations.
This response is all too familiar and highlights the public’s fundamental misunderstanding of how science works. Two issues stand in the way of, and often override, sensible interpretations of research findings – science fatigue and confirmation bias.
The media constantly bombards us with the latest research on a plethora of topics without much nuance on its quality or relevance.
Last year red wine was good, this year it’s bad. Last month lots of water was good, this month it’s bad. Today you need more protein, tomorrow you need more carbohydrates.
This apparent seesaw in health journalism creates science fatigue in the public mind. The underlying science for most of these reports is sound, but as a New England Journal of Medicine editorial suggests, the reporting is often irresponsible and out to click-bait an unsuspecting public:
A problem that is worsening in this era of the 24/7 news cycle is the frequent failure to put new developments into any kind of reasonable context for readers or viewers. In this environment, reporters become little more than headline readers or conduct interviews that amount to a “hit and run” version of journalism.
The constant hype leads to distrust and erodes the integrity of scientific research. How can science be trusted if it can’t make up its mind?
All too often the distinction between scientific opinion and fact is not clear. Effectively communicating specialised scientific findings to the public is a work in progress, and has been a challenge for the media, governments and science for some time.
A 2000 United Kingdom report into the country’s mad cow disease outbreak in the 1990s concluded that a government department had provided inappropriate technical advice about the link between contaminated beef and human health. It said the department’s communication had provoked an “irrational public scare”.
A barrage of similar instances has created a crying wolf scenario, particularly when journalists and public relations operators report certain studies as the final word. When the real wolf appears (like last week’s WHO meat evaluation) we brush it away as insignificant and continue our existing behaviours.
Recently a family friend pronounced that his grandmother smoked all her life and reached the ripe old age of 90, so he is not worried about his “moderate” smoking habit. His grandmother may have had the potential to reach 120 as a non-smoker, but numerous other variables could have influenced the final result for her.
All too often, we base important health decisions on personal anecdotal experience. The plural of anecdote is not data, yet we grasp at any straw that reinforces our own opinions so we can maintain our status quo. This is called confirmation bias.
In an extensive review of this phenomenon, American psychologist Raymond Nickerson contends it might in fact be the single most problematic aspect of human reasoning.
…once one has taken a position on an issue, one’s primary purpose becomes that of defending or justifying that position. This is to say that regardless of whether one’s treatment of evidence was evenhanded before the stand was taken, it can become highly biased afterward.
Numerous studies have documented confirmation bias across all kinds of everyday situations. For instance, we tend to seek out sources of information likely to reinforce our existing beliefs, and to interpret evidence in ways that support them.
Even the pressure to publish can create a bias in scientists which influences the objectivity and integrity of research.
A review of publication and related biases by the British National Institute for Health Research found that studies with significant or favourable results were more likely to be published or cited than those with non-significant or unfavourable results.
When our meat eating – seen as such a fundamental part of our existence, our culture, our economy and perhaps even our identity – is attacked, we resort to confirmation bias and often deploy personal anecdotes as a counterattack.
Certainly anecdotes in health care shouldn’t be ignored, but they need to be understood alongside formal research evidence.
Scientists Aren’t Exempt
The American Dietetic Association holds the position that meat is not required for a healthy diet. Yet we have heard many experts say otherwise. In some cases, this could be because it is part of the social fabric of our society, and scientists aren’t exempt from bias.
A recent study noted that when scientists were put in situations where they were expected to be an expert or see themselves as experts, they tended to over-estimate the accuracy of their own beliefs.
Even if these beliefs stem from a knowledge in their field, the tendency to cling to prior opinions increases the likelihood of bias.
Thankfully, once we are able to overcome our fatigue and biases, and reasonably consider the latest evidence, we can steer ourselves in a direction where the risk of cancer is lower without any knee-jerk reactions.
Daniel du Plooy, PhD Candidate in Social Psychology, La Trobe University
This article was originally published on The Conversation. Read the original article.