It should go without saying that science should dictate how we respond to science denial. So what does scientific research tell us?
One effective way to reduce the influence of science denial is through “inoculation”: you can build resistance to misinformation by exposing people to a weak form of the misinformation.
How do we achieve that in practice? There are two key elements to refuting misinformation. The first is offering a factual alternative. To understand what I mean by this, you need to understand what happens in a person’s mind when you correct a misconception.
People build mental models of how the world works, where all the different parts of the model fit together like cogs. Imagine one of those cogs is a myth. When you explain that the myth is false, you pluck out that cog, leaving a gap in their mental model.
[Figure: Debunking myths creates gaps in people’s mental models; that gap needs to be filled with an alternative fact. John Cook, Author provided]
But people feel uncomfortable with an incomplete model. They want to feel as if they know what’s going on. So if you create a gap, you need to fill the gap with an alternative fact.
For example, it’s not enough to just provide evidence that a suspect in a murder trial is innocent. To prove them innocent – at least in people’s minds – you need to provide an alternative suspect.
However, it’s not enough to simply explain the facts. The golden rule of debunking, from the book Made to Stick by Chip and Dan Heath, is to fight sticky myths with even stickier facts. So you need to make your science sticky: simple, concrete messages that grab attention and stick in the memory.
How do you make science sticky? Chip and Dan Heath suggest the acronym SUCCES to summarise the characteristics of sticky science:
Simple: To paraphrase Nobel Prize winner Ernest Rutherford: if you can’t explain your physics simply, it’s probably not very good physics.
Unexpected: If your science is counter-intuitive, embrace it! Use the unexpectedness to take people by surprise.
Credible: Ideally, draw your information from the most credible source available: peer-reviewed scientific research.
Concrete: One of the most powerful tools to make abstract science concrete is analogies or metaphors.
Emotional: Scientists are trained to remove emotion from their science. However, even scientists are human and it can be quite powerful when we express our passion for science or communicate how our results affect us personally.
Stories: Shape your science into a compelling narrative.
Let’s say you’ve put in the hard yards and shaped your science into a simple, concrete, sticky message. Congratulations, you’re halfway there! As well as explaining why the facts are right, you also need to explain why the myth is wrong. But there’s a psychological danger to be wary of when refuting misinformation.
When you mention a myth, you make people more familiar with it. But the more familiar people are with a piece of information, the more likely they are to think it’s true. This means you risk a “familiarity backfire effect”, reinforcing the myth in people’s minds.
There are several simple techniques to avoid the familiarity backfire effect. First, put the emphasis on the facts rather than the myth. Lead with the science you wish to communicate rather than the myth. Unfortunately, most debunking articles take the worst possible approach: repeat the myth in the headline.
Second, provide an explicit warning before mentioning the myth. This puts people cognitively on guard so they’re less likely to be influenced by the myth. An explicit warning can be as simple as “A common myth is…”.
Third, explain the fallacy that the myth uses to distort the facts. This gives people the ability to reconcile the facts with the myth. A useful framework for identifying fallacies is the five characteristics of science denial, each of which covers a number of specific techniques, particularly under logical fallacies:
[Figure: Five characteristics of science denial. John Cook]
Pulling this all together, if you debunk misinformation with an article, presentation or even in casual conversation, try to lead with a sticky fact. Before you mention the myth, warn people that you’re about to mention a myth. Then explain the fallacy that the myth uses to distort the facts.
Putting It Into Practice
Let me give an example of this debunking technique in action. Say someone tells you that global warming is a myth. Here’s how you might respond:
97% of climate scientists agree that humans are causing global warming. This has been found in a number of studies, using independent methods. A 2009 survey conducted by the University of Illinois found that among actively publishing climate scientists, 97.4% agreed that human activity was increasing global temperatures. A 2010 study from Princeton University analysed public statements about climate change and found that among scientists who had published peer-reviewed research about climate change, 97.5% agreed with the consensus.
I was part of a team that in 2013 found that among relevant climate papers published over 21 years, 97.1% affirmed human-caused global warming.
However, one myth argues that there is no scientific consensus on climate change, citing a petition of 31,000 dissenting scientists. This myth uses the technique of fake experts: 99.9% of those 31,000 scientists are not climate scientists. The qualification to be listed in the petition is a science degree, so that the list includes computer scientists, engineers and medical scientists, but very few with actual expertise in climate science.
And there you have it.
In our online course, Making Sense of Climate Science Denial, we debunk 50 of the most common myths about climate change. Each lecture adopts the Fact-Myth-Fallacy structure: we first explain the science, then introduce the myth, then explain the fallacy that the myth uses.
In our sixth week on the psychology of debunking, we also stress the importance of an evidence-based approach to science communication itself. It would be most ironic, after all, if we were to ignore the science in our response to science denial.