Communities and Social Media

How filter bubbles and echo chambers reinforce negative beliefs and spread misinformation through social media

Abstract

This paper examines how negative beliefs and misinformation are reinforced and spread on social media. Social media platforms provide an online space in which communities can connect and share content, but the ease with which they allow filter bubbles and echo chambers to form is a concern. Filter bubbles are created by online processes that deliver personalized results, recommendations and news feeds based on the things we have previously engaged with. This can leave us stuck in a feedback loop that constantly reinforces our beliefs and interests, forming echo chambers. Echo chambers can be harmless, but when filter bubble feedback loops combine with people holding extreme views who are looking for reinforcement of negative beliefs, the result can be disastrous real-world consequences.

Hashtags: #onlinecommunities, #socialmedia, #filterbubbles, #echochambers, #feedbackloops

How filter bubbles and echo chambers reinforce negative beliefs and spread misinformation through social media

New technologies have transformed communities from densely connected social ties based around locality to a wide online space where virtual communities can flourish. The introduction of Web 2.0 brought participation, dynamic content and metadata for a richer and more interactive online experience (Best, 2006). Social media platforms have been a big part of keeping these virtual communities growing and thriving, with features built around sharing, reacting to and discussing all kinds of content with all kinds of people. It is easy to get lost in the wealth of information online, and with personalized filters providing easy ways to find things we are interested in and people who are similar to us, we are constantly connected and finding new connections. A downside is the possibility of getting stuck in a feedback loop of filter bubbles and echo chambers: constant reinforcement of our values, behaviours and interests. These concepts are not unique to the online world, but the invisibility and secretiveness of the filtering processes, and the ease with which we can fall into these online traps, are a concern. While many people will experience filter bubbles and echo chambers during their time online, they are most dangerous for vulnerable people and those with extremist views and polarizing opinions. Although online social media platforms offer a diverse range of opinions and information, individuals can become trapped in filter bubbles and self-selected echo chambers that reinforce negative beliefs and spread misinformation.

Defining the term ‘community’ is a hotly contested issue within the social sciences, but the term can be taken to describe social groups that share locality, ethnicity, class, religion, politics, social ties, interests or other commonalities. No matter the shared commonality, the “core of community is the provision of social belonging” (Dotson, 2017, p. 33). Gerard Delanty (2018) suggests that the American and French revolutions, industrialization and globalization were major upheavals leading to the main discourses around community. Before the industrial revolution, communities were based on “densely connected relations, organized around the home and small-town life” with a “high degree of conformity to similar beliefs, backgrounds, and activities” (Hampton & Wellman, 2018, p. 6). Industrialization increased migration and mobility, allowing people to escape their small “organically connected” social ties formed through “kinship, locality, and occupation” (Simmel, 1950, p. 404) and leaving them free to create new ties based around their personal interests (Hampton, 2016). With new transportation and communication technologies, “shared interest [began] to replace shared place as the dominant force in social tie formation” (Hampton, 2016, p. 107).

While traditional communities were creations of the limitations of mobility, evolving technology has allowed a reshaping of people’s networks and behaviours into the contemporary digital social structure (Hampton & Wellman, 2018). The growth of social media has allowed the formation of a hybrid of the preindustrial and urban-industrial community structures through persistent contact and pervasive awareness (Hampton, 2016). Social ties that were once lost as an individual moved localities, schools or jobs now have the potential to “become enduring channels of communication” with the persistent contact afforded by social media (Hampton, 2016, p. 111). Social media profiles and email addresses create a permanent digital location at which an individual can be contacted. Friend and follower lists “allow for people to sustain contact without substantially drawing from the time and resources required to maintain ties” through past methods of communication (Hampton & Wellman, 2018, p. 647). The ability to share text and photos through social media sites or blogs provides a pervasive awareness of “activities, interests, location, opinions, resources and life course transitions of social ties” (Hampton, 2016, p. 111). Virtual communities are communities “mediated by a highly personalized technology” (Delanty, 2018, p. 204), bringing strangers together “in a sociality often based on anonymity”, intertwining politics and subjectivity to form a new “intimacy” (Delanty, 2018, p. 205). Unlike the traditional “thick or organic communities”, these are thin and fragile communities made up of strangers with no strong ties (Delanty, 2018, p. 205). They can, however, strengthen existing “social and political realities” by linking individuals with similar beliefs, interests and tastes (Delanty, 2018, p. 215).

Online communities have a large presence on social media platforms, which offer many interactive features to help communities stay connected. Facebook has reaction buttons and sharing and commenting features for posts, ensuring that the most engaged-with content is the most visible. Facebook groups also help like-minded people discuss and share content around specific topics. YouTube has like and dislike buttons, comments, recommendations and prominently displayed view counts that make it easy to know what is popular. Twitter and Instagram rely heavily on hashtags to spread content. Hashtags function as a form of “searchable” talk that both invites an audience and labels the meanings users are trying to express (Zappavigna & Hyland, 2014, p. 11). Hashtags help to create a common language, which is a “key element of community formation” (Gruzd, Wellman & Takhteyev, 2011, p. 1301). Their use assumes there is a virtual community of interested people, making it possible to find others with similar interests, and “intensifies a call to affiliate with these values” (Zappavigna & Hyland, 2014, p. 92).

Though filter bubbles are not a new phenomenon and do not occur only online, the term was coined by Eli Pariser to describe the invisible and secretive filtering process by websites that “create[s] a unique universe of information” personalized to each of us (Pariser, 2011, p. 10). While these personalized filters help users find and consume the information they want amongst the torrent of data available online, they can also “indoctrinate us with our own ideas, amplify our desire for things that are familiar and leave us oblivious to the dangers lurking in the dark territory of the unknown” (Pariser, 2011, p. 13). They provide us with a cozy online space where we are surrounded by all the things we like and believe in, but leave less room for “chance encounters that bring insight and learning” from the “collision of ideas from different disciplines and cultures” (Pariser, 2011, p. 13). Though users “retain their own agency to make choices about searching, connecting, and engaging with others” (Bruns, 2019, p. 23) and filter bubbles may not generally be as dangerous as they sound, it is important to realise that they are formed invisibly and without our explicit consent. The algorithms that Google uses to personalize our searches or that Facebook uses to organize our news feeds are secret: we are not told why we are seeing certain results or what is being filtered out for us, and we do not get to choose these criteria. This makes it easy for people to believe that what they are reading and seeing is “unbiased, objective and true” (Pariser, 2011, p. 10). Pariser (2011) gives an example of two similar women living in the same part of the US who searched for “BP” on Google during the oil spill crisis: one saw search results about the crisis, but the other received only investment information. The algorithms at play make it easy for things to be hidden from us, even when that is not in our best interest.
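To make this filtering mechanism concrete, the following minimal Python sketch ranks a hypothetical news feed by how closely each item matches a user's past clicks. The function names (build_profile, rank_feed) and the example data are invented for illustration; this is an assumption about how engagement-based personalization can work in general, not any platform's actual code.

```python
# Illustrative sketch only: a toy "personalized filter" that reorders a feed
# by similarity to past engagement. All names and data here are hypothetical.
from collections import Counter

def build_profile(clicked_topics):
    """Count how often the user has engaged with each topic."""
    return Counter(clicked_topics)

def rank_feed(items, profile):
    """Order candidate (headline, topic) pairs by past engagement with the topic.

    Topics the user has never clicked score zero and sink to the bottom,
    so the user rarely encounters them -- a "unique universe of information".
    """
    return sorted(items, key=lambda item: profile[item[1]], reverse=True)

profile = build_profile(["investing", "investing", "sports"])
feed = [
    ("BP share price rises", "investing"),
    ("BP oil spill update", "environment"),
    ("Local match results", "sports"),
]
print(rank_feed(feed, profile))  # the unfamiliar "environment" story is ranked last
```

Even this toy example never tells the user that the unfamiliar story was pushed down the feed, which is precisely the invisibility Pariser objects to.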

Personalized filters show us the things we are most likely to be interested in and therefore most likely to click on and share. They are designed to play on our compulsivity, showing us content that is attention-grabbing and stimulating. Consider that 70% of YouTube watch time comes from recommendations (Solsman, 2018). Consider also that people are more likely to look at negative or divisive stories and videos than positive or neutral ones (Reis et al., 2015) and that humans in general tend to believe negative information over positive information (Morwedge, 2009). Former Google design ethicist Tristan Harris says that this behaviour can lead to situations such as a teenage girl watching a dieting video being recommended videos on anorexia, or someone watching World War II videos being recommended Holocaust denial videos (JRE Clips, 2020). Similar behaviour occurs on Facebook, such as a new mother joining a Facebook group for organic baby food and finding anti-vaccine groups for mothers among her recommended groups (JRE Clips, 2020). These videos and groups are recommended simply because the recommendation algorithm knows that people like you spent more time watching those videos or participating in those groups built around divisive and ‘extreme’ topics. With most social media sites lacking any way to give meaningful negative feedback, content presented with “extreme sentiment” will be the most shared and viewed regardless of how truthful or dangerous it may be. As people are most likely to share this content with friends, followers or groups that hold similar ideological views (Barberá, 2020), reinforcement of extreme views in these communities can lead to the formation of echo chambers.
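The feedback loop this paragraph describes can be sketched in a few lines. The toy recommender below, with invented video titles and watch-time figures, simply surfaces whichever candidate held the attention of "people like you" the longest; once divisive content wins that contest, it keeps winning. This is a hedged illustration of engagement-maximising ranking in general, not YouTube's or Facebook's actual system.

```python
# Illustrative sketch only: an engagement-maximising recommender.
# Titles and watch times are invented; divisive content is assumed to hold
# attention longer, as the sources cited above suggest.
videos = {
    "balanced diet tips": 4.0,      # average minutes watched by similar users
    "extreme dieting hacks": 9.5,   # divisive content holds attention longer
}

def recommend(candidates):
    """Return whichever candidate similar users watched the longest."""
    return max(candidates, key=candidates.get)

history = ["balanced diet tips"]
for _ in range(3):
    history.append(recommend(videos))

print(history)
# ['balanced diet tips', 'extreme dieting hacks', 'extreme dieting hacks', 'extreme dieting hacks']
# With no meaningful negative-feedback signal, nothing ever pushes the ranking back.
```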

Long before the internet, the traditional community structure—with its densely connected networks and rigid hierarchies—easily allowed echo chambers to form (Hampton & Wellman, 2018). The mobility gained through industrialization provided an escape from these echo chambers, but the digital age has given rise to self-selected online echo chambers. An echo chamber is an “environment in which the opinion, political leaning, or belief of users about a topic gets reinforced due to repeated interactions” with those who share their attitudes (Cinelli et al., 2021, p. 1). The formation of echo chambers on social media can be blamed on human tendencies such as selective exposure, confirmation bias and group polarization (Cinelli et al., 2021) and is further accelerated by filtering algorithms (Barberá, 2020). While the online community can offer a wide range of information and opinions, exposure to dissonant information can “reduce perceived homophily, increase cognitive dissonance, and silence democratic debate” (Hampton & Wellman, 2018, p. 648). Disconfirmation bias has also been shown to occur, where people counter-argue and degrade opposing arguments, leading to a stronger belief in the opinion they already held (Karlsen et al., 2017). Online or otherwise, people “prefer information adhering to their worldviews, ignore dissenting information and form polarized groups around shared narratives” (Cinelli et al., 2021, p. 5).
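One way to picture how "repeated interactions" with like-minded users reinforce opinions is a toy bounded-confidence simulation. The sketch below follows the spirit of Deffuant-style opinion dynamics rather than any model used by Cinelli et al. (2021), and its parameters are arbitrary: agents adjust their views only when they meet someone whose view is already close to theirs, so the population fragments into tight, self-reinforcing clusters instead of converging.

```python
# Illustrative sketch only: a toy bounded-confidence opinion model.
# The tolerance, pull strength and 0-1 opinion scale are arbitrary choices.
import random

def step(opinions, tolerance=0.2, pull=0.5):
    """Pick two random agents; if their views are close enough, they move
    toward each other. Dissonant pairs ignore one another entirely."""
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < tolerance:
        midpoint = (opinions[i] + opinions[j]) / 2
        opinions[i] += pull * (midpoint - opinions[i])
        opinions[j] += pull * (midpoint - opinions[j])

random.seed(1)
opinions = [random.random() for _ in range(20)]   # initial stances on a 0-1 scale
for _ in range(5000):
    step(opinions)

print(sorted(round(o, 2) for o in opinions))
# The result is a handful of tight clusters rather than one consensus:
# each cluster only ever hears echoes of itself.
```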

The social media features that allow us to find like-minded people so easily “can facilitate and strengthen fringe communities” and “further polarize their views or even ignite calls-to-action” (Van Alstyne & Brynjolfsson, 1996, p. 5). This creates a “breeding ground for extremism” (Sunstein, 2001, p. 71) and is likely one reason behind the rise of hate groups such as white supremacists, Holocaust deniers and incels (Barberá, 2020). A group enclosed in a filter bubble can slide into groupthink, “a temporary loss of the ability to think in a rational, moral and realistic manner”, which can lead members to regard “those outside the group as enemies, censor opposing ideas, and pressure members to conform” to their beliefs (Farnam Street, 2017, para. 55). Misinformation spreads further and more quickly as polarization increases (Cinelli et al., 2021), and this fuels the growth of groups such as the anti-vaccine and flat Earth communities. While these communities may seem relatively harmless because they exist online, they have real-world implications: those subscribing to the incel ideology have caused the murders of at least forty-four people since 2014, with one man posting “The Incel Rebellion has begun!” on Facebook before killing ten people (Tolentino, 2018, para. 7), and anti-vaccine groups have been linked to reduced vaccination rates and a resurgence of measles and mumps (Benecke & DeYoung, 2019). Unlike filter bubbles, which can occur without any contribution from us, online echo chambers are self-selected by people who are generally looking for support or reinforcement of their beliefs rather than for challenging information. Real life presents us with interactions that “force us to deal with diversity” (Putnam, 2000, p. 178) and no easy way to block people who say things we may not want to hear, but the online world gives us far more freedom to shut out unwelcome information. If the information people want to hear is reinforcement of hate, negativity and misinformation, then it can become a serious problem even for those not involved in those particular communities.

Social media platforms provide an interactive and personalized experience that makes it easy for online users to connect with content and people whose values they share, but this leaves people at risk of becoming stuck in filter bubbles and echo chambers, particularly ones that reinforce negative opinions, behaviour and information. The evolution of traditional communities into online communities has given us platforms and opportunities to connect with, react to and share a constantly updated stream of information, and to build new communities that could never have existed offline. Personalized filters that provide recommendations based on the content we engage with, and on what people like us engaged with, give us a unique, narrowed-down slice of the web, but can also prevent us from encountering ideas that might challenge or inform us. Echo chambers form when people find themselves in a system of constantly reinforced beliefs that blocks out opposing views, and they are of serious concern when they involve extremism and political polarization. Filter bubbles and echo chambers existed well before the internet and social media, but the online world makes it very easy to become blind and complacent. It becomes particularly dangerous when those with polarizing and extremist views are given this online soapbox and connect with others who reinforce and spread their ideologies. Whether these mechanisms merely leave us lazy and unchallenged or give some the opportunity to spread hate and misinformation, they are a concern for all of us who spend time online. While we cannot escape filter bubbles and echo chambers entirely, it is essential that we remain aware of them and try to expose ourselves to new views and ideas as much as we can.

References

Barberá, P. (2020). Social Media, Echo Chambers, and Political Polarization. In N. Persily & J. Tucker (Eds.), Social Media and Democracy: The State of the Field, Prospects for Reform (pp. 34-55). Cambridge: Cambridge University Press.

Benecke, O., & DeYoung, S. E. (2019). Anti-Vaccine Decision-Making and Measles Resurgence in the United States. Global pediatric health, 6, 2333794X19862949. https://doi.org/10.1177/2333794X19862949.

Bruns, A. (2019). Are filter bubbles real?. Polity Press.

Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences of the United States of America, 118(9). https://doi.org/10.1073/pnas.2023301118.

Delanty, G. (2018). Community. Routledge.

Dotson, T. (2017). Technically Together. The MIT Press.

Farnam Street. (2017). How Filter Bubbles Distort Reality: Everything You Need to Know. https://fs.blog/2017/07/filter-bubbles/.

Gruzd, A., Wellman, B., & Takhteyev, Y. (2011). Imagining Twitter as an Imagined Community. American Behavioral Scientist, 55(10), 1294–1318. https://doi.org/10.1177/0002764211409378.

Hampton, K. N. (2016). Persistent and Pervasive Community: New Communication Technologies and the Future of Community. American Behavioral Scientist, 60(1), 101–124. https://doi.org/10.1177/0002764215601714.

Hampton, K. N., & Wellman, B. (2018). Lost and Saved . . . Again: The Moral Panic about the Loss of Community Takes Hold of Social Media. Contemporary Sociology, 47(6), 643–651. https://doi.org/10.1177/0094306118805415.

Karlsen, R., Steen-Johnsen, K., Wollebæk, D., & Enjolras, B. (2017). Echo chamber and trench warfare dynamics in online debates. European Journal of Communication, 32(3), 257–273. https://doi.org/10.1177/0267323117695734.

Putnam, R. (2000). Bowling alone: America’s declining social capital. Simon and Schuster.

Reis, J., Benevenuto, F., Vaz de Melo, P., Prates, R., Kwak, H., & An, J. (2015). Breaking the news: First impressions matter on online news. Ithaca: Cornell University Library, arXiv.org. https://link.library.curtin.edu.au/gw?url=https://www.proquest.com/working-papers/breaking-news-first-impressions-matter-on-online/docview/2081799886/se-2?accountid=10382.

Simmel, G. (1950). The Sociology of Georg Simmel. Free Press.

Solsman, J. (2018, January 10). YouTube’s AI is the puppet master over most of what you watch. CNET. https://www.cnet.com/news/youtube-ces-2018-neal-mohan/.

Sunstein, C. R. (2001). Republic.com. Princeton University Press.

Van Alstyne, M. W., & Brynjolfsson, E. (1996). Electronic Communities: Global Village or Cyberbalkans? http://citeseerx.ist.psu.edu/viewdoc/similar?doi=10.1.1.72.3676&type=ab.

Zappavigna, M., & Hyland, K. (2014). Discourse of Twitter and social media: How we use language to create affiliation on the web. Bloomsbury Publishing Plc. https://ebookcentral.proquest.com.

3 thoughts on “How filter bubbles and echo chambers reinforce negative beliefs and spread misinformation through social media”

  1. Hi Lauren,

    Great paper. I liked that you acknowledged both the benefits and downfalls of online communities in the beginning of your paper. You have explained the two concepts, filter bubbles and echo chambers, very thoroughly.

    I would love to know more about examples of the harmful outcomes brought about by communities shaped by echo chambers and/or filter bubbles. I would also love to know your thoughts on whether the formation of echo chambers and filter bubbles could be avoided on social media.

    I have spoken about the spread of misinformation in my paper and briefly mention echo chambers, so it was interesting to read your more in-depth study of the concept. This is the link to my paper if you want to give it a read: https://networkconference.netstudies.org/2021/2021/04/26/tiktok-influencers-spreading-bad-health-habits-and-promoting-a-starving-gen-z/

    Once again, well done on your paper.

  2. Hello Lauren,

    Echo chambers are certainly an interesting yet disturbing aspect of the omnipresent use of social media platforms. Your paper is very informative about how echo chambers are constructed and how destructive they are to those who become entrenched in a feedback loop of self-affirming ideological narratives. Echo chambers are definitely a hidden consequence of social media’s business of maintaining users’ attention. Social media’s pervasive collection of personal data dictates the curation of ideologically similar content designed to keep users hooked on the sites. As this personal data is valuable to a considerable number of stakeholders, it is no surprise when sites such as Facebook employ tactics to hold users’ attention for as long as possible to maximise advertising revenue. It seems the major social media platforms are utilising echo chambers for their own revenue-generating ambitions. Do you believe a social media platform could exist that does not push users into a self-induced echo chamber, or are echo chambers unavoidable in today’s digitally intimate society?

    At the start of the paper, you state that echo chambers are “most dangerous when it comes to vulnerable people and those with extremist views and polarizing opinions.” In my conference paper I also discuss the construction of social media-induced echo chambers (https://networkconference.netstudies.org/2021/2021/04/25/facebooks-segregation-of-communities-is-fueling-the-destruction-of-democracy/). From my understanding, the effects of echo chambers are widespread across a diverse range of users, and their “most dangerous” consequences lie not in their impact on a certain type of person but in the ramifications they have for how information is dispensed and, ultimately, for the functioning of democracy. I certainly acknowledge that echo chambers tend to polarize individual users, leading to extremism or radicalisation around their already accepted ideological narratives, but I feel that most people exposed to an echo chamber would be affected in one way or another (in their worldview or political understanding). As Castells (2007) explains, “What does not exist in the media doesn’t exist in the public mind.” Therefore I believe echo chambers have a widespread effect on all users (and on democracy), and not just on a certain type of vulnerable person.

    Castells, M. (2007). Communication, power, and counter-power in the network. International Journal of Communication, 1, 238-266. https://ijoc.org/index.php/ijoc/article/view/46

  3. Thanks for your paper Lauren.

    I find the concept of echo chambers fascinating, especially when discussing them in the context of online interaction. As you mention, these have always existed but were essentially limited in their reach by geography. Your discussion of the human tendencies behind these echo chambers is also really intriguing. Like you say, we are essentially bound to prefer them, as the safety of our own ideals and worldviews is a lot easier to accept than the views of others.

    I discuss this a bit in my paper but in more of a political sense as these echo chambers can create an incentive to rise to political action without consideration for opposing views.

    Thanks again for your paper, very much enjoyed reading it.
