Why do people cling to beliefs, even when they’re shown incontrovertible proof that those beliefs are at least partly, or perhaps entirely, erroneous? This is a complex question with no easy answers, but psychologists are giving it a shot.
A team from the University of Rochester and the University of California, Berkeley, has its own theory: people’s beliefs are hardened more by the feedback they get from others than by anything purely logical or scientific.
If true, this has important implications. It suggests, for example, that climate change deniers aren’t convinced by hard data at all. Instead, all else being equal, they are influenced by how others react to their opinion.
For this Open Mind study, the researchers recruited 500 adults via the Mechanical Turk online crowdsourcing platform. They then asked them to look through a sequence of colored shapes onscreen and pick out which of them could be defined as a “Daxxy.”
No such thing exists, and no defining criteria were divulged, so it didn’t really matter how the participants responded. Nevertheless, after each choice the researchers told them whether they were “correct” or not, and the participants also had to report how confident they felt about each choice they made.
No matter where it happened during the sequence of 24 images, those who “correctly” identified a Daxxy a few times in a row reported growing increasingly confident about their future choices. In other words, being told they were doing well – even though the feedback was meaningless – boosted their certainty.
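To make that pattern concrete, here is a minimal, hypothetical sketch – not the researchers’ actual analysis or model – of how confidence that tracks only the last few pieces of feedback can diverge from confidence that tracks overall performance. The window of three trials and the coin-flip feedback are assumptions made purely for illustration.

```python
import random

# Hypothetical illustration only: compare confidence based on the most
# recent feedback with confidence based on cumulative performance.

TRIALS = 24   # the study used a sequence of 24 images
WINDOW = 3    # "a few times in a row" - an assumed window size

random.seed(1)
feedback = [random.choice([0, 1]) for _ in range(TRIALS)]  # 1 = told "correct"

for t in range(1, TRIALS + 1):
    # Confidence driven by the last few outcomes vs. by the whole history.
    recent = sum(feedback[max(0, t - WINDOW):t]) / min(t, WINDOW)
    overall = sum(feedback[:t]) / t
    print(f"trial {t:2d}: recent-feedback confidence={recent:.2f}, "
          f"cumulative confidence={overall:.2f}")
```

On a typical run, the recency-based figure jumps to 1.0 after a short streak of “correct” feedback even while the cumulative figure hovers around chance – roughly the pattern the researchers describe in their participants.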
The implication is that feedback, and the self-assessment it fuels, is a key (but not sole) factor influencing how certain people are of their beliefs. It’s easy to see why this is problematic.
Back to climate change deniers. Suppose someone decides climate change is a conspiracy; the first few times they tell their friends this, or mention it in a like-minded forum, they may well be told that their view is correct.
Based on this paper’s findings, that approval will harden their opinion, making them extremely resistant to change even when presented with compelling counter-arguments, and leaving their genuine curiosity about the subject greatly diminished.
As mentioned earlier, other studies have weighed in on this subject. This is just one of many examples, with each paper constructed differently and interpreting its data differently, reaching conclusions that are sometimes contradictory and sometimes corroborative of one another.
At the very least, though, it’s becoming clear that objective evidence doesn’t change minds the way it’s supposed to.
This study, for example, suggested that anti-vaxxers suffer from the Dunning-Kruger effect: the more ignorant of the facts they are, the less able they are to notice their own ignorance. This one linked various biases to certain beliefs: conservatism was associated with climate change denial, moral purity concerns were linked to vaccine skepticism, and low science literacy was linked to a lack of support for GM crops.
Several papers have famously found that people presented with evidence contradicting their beliefs often become even more entrenched in them. Even when it comes to memories, confidence isn’t linked to whether or not the memory is real.
Finding the root cause of something as complex as people’s deeply held beliefs is far from easy. It must also be said that, with regard to this new study, 500 people isn’t a large sample size.
Either way, it adds to the pile of evidence indicating that humans are not idealized learning machines, while also raising a question that, for now, remains rhetorical: what should we do about the (huge) problem of subjective certainty?