A new study has found that you create your own echo chamber – not Google Search's algorithms. The research found that individuals actively sought out information that fit their existing beliefs, rather than having it pushed to them by search engine algorithms, undercutting the idea that Google is actively reinforcing a particular belief system.
Since the rapid emergence of social media and search engines that collect user data, there have been concerns that algorithms are placing us into "echo chambers", in which users' political leanings – in either direction – are amplified by technology that only shows content affirming them. These echo chambers supposedly downrank content that challenges our worldview and promote content that agrees with it, pushing each person further toward the extreme ends of the spectrum.
But is it actually tech companies doing it, or do we do it to ourselves? The internet provides us with any information we wish to see, so are we simply seeking out the content that is agreeable?
To find out which it is, a team from Stanford University conducted a two-wave study of participants' browsing habits across two US election cycles. The first wave, in 2018, involved between 262 and 333 people, while the second, in 2020, drew on a larger sample of between 459 and 688. Each participant installed a custom browser extension that tracked which URLs Google showed them, which URLs they engaged with, and what other people were engaging with at the time.
Once collected, the researchers looked to see whether the participants were being pushed partisan content, or if they were seeking it out.
In both waves, the results showed that people engaged with more partisan URLs than Google actually showed them. The researchers concluded that Google Search was not driving the echo chambers; rather, user choice was, with people seeking out far more partisan URLs than Google presented. They also found that the information surfaced by Google Search was generally of higher quality than that from sources the participants found independently, suggesting the search engine may be doing more good than harm.
While the study may absolve Google, it doesn't necessarily mean all echo chambers are entirely user-created. People find sources through social media, news tabs, and many other channels that may be politically biased, so users aren't necessarily solely responsible. It does, however, seem likely that people preferentially search for content that agrees with their beliefs – which is something we probably already knew.
The study is published in the journal Nature.