A team of researchers has conducted a five-year study of a wide range of Facebook users in a quest to find out how misinformation blossoms online. In their paper, published in the Proceedings of the National Academy of Sciences, they note that its spread may be due to the nature of so-called “echo chambers,” spaces that allow people to amplify their own belief systems without obstruction.
In this sense, echo chambers describe certain areas of the media, particularly the Internet, wherein information or beliefs are reinforced by repetitive transmission inside an enclosed virtual space. These spaces, which also serve to keep contrasting views at bay, may explain why there are so many groups of people online – particularly on Facebook – that steadfastly believe information that is demonstrably nonsensical.
To investigate how effective these echo chambers were, the researchers comprehensively analyzed 67 public Facebook pages – 32 devoted to conspiracy theories and 35 to science news – tracking every post, and how followers interacted with it, from 2010 to 2014. The conspiracy theory pages included those that reject the overwhelming scientific consensus on contemporary climate change, and those that interpreted Jade Helm 15 – a series of military training exercises that took place across the U.S. last year – as a sign of an impending civil war.
Echo chambers may explain how some people thought routine military exercises last year represented the beginning of a civil war. Przemek Tokar/Shutterstock
A third group, consisting of two trolling pages – those that intentionally disseminate sarcastic, false information for humorous effect – was also taken into account. These pages acted as the study’s control group.
The researchers found that posts are initially distributed in the same way for both the science and conspiracy theory pages. A post is shared most frequently within the first two hours of being posted, and again after 20 hours, regardless of topic or validity – mostly among those who agree with its views.
However, a difference emerges in the long term. Science news spreads relatively quickly across the web before sharing and discussion of the post drop off. Conversely, conspiracy theories build momentum more slowly, but are then shared and discussed increasingly over a longer period of time. This also means that conspiracy theories that gradually gain traction can persist online indefinitely, regardless of their limited factual basis.
Most significant, however, is that the long-term online behavior of any type of group user both constructs and strengthens their own echo chamber. Individual people, publications or news organizations whose posts you click on or comment on more frequently will appear in your News Feed more often as a result; those you ignore will fade into near-complete obscurity.
This in itself is an echo chamber, one in which the information fed back to you is reinforced by your own online interactions. Eventually, therefore, a user’s Facebook space may include only information they already believe in, and only people who agree with them.
A claim, whether it is substantiated or not, is given credence in the mind of an individual if the surrounding society deems it acceptable. This is known as confirmation bias, and this study shows that the phenomenon is just as prevalent in online communities as it is in physical ones. In the case of misinformation, this is incredibly dangerous – so much so that the World Economic Forum has declared its online spread, a form of “digital wildfire,” one of the main threats to global society.