Social Networks

The Societal Effects of Online Echo Chambers

Social media platforms have created a new avenue for expression and education. This revolutionary communication technology has permeated every aspect of the digitally connected world, allowing individuals to connect, create and communicate with others from anywhere in the world. This in turn has allowed users to find others with similar tastes and interests and gather together, creating large online groups where discussion and discourse can take place. This natural sorting of people into categories is further reinforced by social media platforms like Facebook and Twitter, which use algorithms to organise the vast amount of content. These algorithms are implemented to show users the content most likely to resonate with them and keep them engaged with the site. Although an important time saver in the attention economy, a side effect of this function is the development of echo chambers, in which users are only exposed to thoughts, ideas and content that do not challenge their current beliefs and instead confirm their already established biases. The most obvious example of this is the polarisation of people towards both the left and right extremes of politics. In this conference paper I will discuss the issues surrounding social media platforms and their contribution to the development of echo chambers and confirmation bias, and examine how and why this matters, especially in the context of political discussion.

Social media platforms like Facebook and Twitter have had a strong influence in creating online echo chambers by showing specific content to specific users based on the content those users have previously shown interest in.
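The feedback loop described above can be illustrated with a deliberately simplified sketch. This is a hypothetical toy model, not the actual ranking system of any platform (real feed algorithms are vastly more complex and proprietary): a user's interactions build up per-topic interest scores, and the feed is then sorted by those scores, so already-engaged topics crowd out everything else.

```python
from collections import defaultdict

class FeedRanker:
    """Toy engagement-based feed ranker (illustrative assumption only)."""

    def __init__(self):
        # topic -> accumulated interest score for a single user
        self.interest = defaultdict(float)

    def record_engagement(self, topic, weight=1.0):
        # A like, share, or click raises the user's score for that topic.
        self.interest[topic] += weight

    def rank(self, posts):
        # Order candidate posts by learned topic interest, highest first.
        return sorted(posts, key=lambda p: self.interest[p["topic"]],
                      reverse=True)

ranker = FeedRanker()
ranker.record_engagement("politics/left", weight=3.0)
ranker.record_engagement("sports", weight=1.0)

feed = ranker.rank([
    {"id": 1, "topic": "sports"},
    {"id": 2, "topic": "politics/right"},
    {"id": 3, "topic": "politics/left"},
])
# Posts matching prior engagement rise to the top; topics the user has
# never engaged with sink, narrowing future exposure each time the loop runs.
```

Even in this minimal form, the self-reinforcing dynamic is visible: content the user has never engaged with is ranked last, so it is less likely to be seen, engaged with, and therefore ranked higher in future.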

The development of echo chambers begins through the behavioural process of users being more likely to stay engaged with content that reinforces views they agree with. This happens because “information or media messages that challenge people’s beliefs typically create dissonance, which is unpleasant and something most people want to avoid. The result of selective exposure is a reinforcement of individuals’ own beliefs.” (Karlsen, Steen-Johnsen, Wollebæk, & Enjolras, 2017).

This effect is exacerbated when it comes to political bias, as users are more susceptible to subconscious confirmation bias. It also occurs because, when a user engages with a community that shares a view on a single issue, it can become difficult to go against the grain: bringing up valid counterpoints can lead to alienation and exclusion from the group. This effect isn’t caused by group mentality alone but by something within human behavioural patterns, which suggest that “when it comes to explicitly political issues, individuals are clearly more likely to pass on information that they have received from ideologically similar sources than to pass on information that they have received from dissimilar sources.” (Barberá, Jost, Nagler, Tucker, & Bonneau, 2015)

Due to the expansive nature of the internet and the possibilities opened up by the change in internet dynamics described as Web 2.0, users now have the option of adapting their online environment to their own personal preferences, deepening the behavioural rut many people fall into that leads them to seek out information confirming their existing biases. The idea that users are resistant to information that goes against their established views is reinforced when Garrett states that “awareness of other views is further enhanced by the fact that individuals tend to spend more time examining information sources that include opinion challenges. This is not to say that people are persuaded by this content—to the contrary, other studies have suggested that the additional time is spent critiquing the other perspectives—but the process does help to ensure political awareness.” (2009). This suggests that even when users are confronted with opposing ideas, they are more likely to scrutinise them with bias and react in a way that strengthens their current views on issues.

Although this behavioural pattern is inherently human, the content algorithms built into social media and the internet have added momentum to the issue. Flaxman, Goel, & Rao “showed that articles found via social media or web-search engines are indeed associated with higher ideological segregation than those an individual reads by directly visiting news sites.” This sort of behaviour could also be seen before social media existed; Flaxman, Goel, & Rao suggest “that the vast majority of online news consumption mimicked traditional offline reading habits, with individuals directly visiting the home pages of their favourite, typically mainstream, news outlets.” (2016).

These are a few of the ways that online echo chambers have begun to appear and grow in size. This sort of behaviour existed before social media and the internet, but the combination with the unintended side effects of algorithmic technologies has only exacerbated the growth of echo chamber communities.

Echo chambers affect not only the shaping of communities online but also the content created by those communities, as members are more likely to stay within the lines of what is considered acceptable in any particular community. This can limit creativity and turn the community itself into a rabbit hole, in which casual participants are turned off by the extremism that can be produced.

There are two types of users that exist within an echo chamber environment. The first is the partisan user, who primarily consumes and produces content within their own community. The second is the gatekeeper: gatekeepers are “users who are bipartisan consumers but partisan producers. These users lie in-between the two opposed communities in network terms, but side with one in content terms.” (Garimella, De Francisci Morales, Gionis, & Mathioudakis, 2018). Their work also discusses how bipartisan users are affected by the dynamics of the separate, conflicting echo chambers they are invested in: “Overall, bipartisan users pay a price in terms of network centrality, community connection, and endorsements from other users (retweets, favourites). This is the first study to show the price of being bipartisan, especially in the context of political discussions forming echo chambers. This result highlights a worrying aspect of echo chambers, as it suggests the existence of latent phenomena that effectively stifle mediation between the two sides.” (Garimella, De Francisci Morales, Gionis, & Mathioudakis, 2018).

Even if a user actively attempts to ensure they are not in an echo chamber, the algorithms in place can prevent them from manually sorting content the way they would like. There is a form of cognitive dissonance in which users believe they are not in an online echo chamber; yet in a study in which a selected group was made to follow opposing content creators, the “results reveal a disconnect between belief and action. Participants who are asked to find their accounts in a sampled social network where nodes are coloured by inferred political ideology tend to increase their belief in how ideologically-cocooned they really are, but the political diversity of who they choose to follow on Twitter actually decreases several weeks after treatment.” (Gillani, Yuan, Saveski, Vosoughi, & Roy, 2018).

People are more willing to accept falsehoods or propaganda as truth because of a form of peer pressure described by Khosravinik: “The essence of Social Media is credibility gained by visibility/popularity as popularity equates commercial gain regardless of the consequences. This may work fine within commercial domains but when a similar logic is applied to the sphere of politics the results could be disastrous.” (2017).

This creates an effect where people are more likely to accept information because it has already been accepted by so many other people who have established themselves as like-minded through involvement in the same community. Bessi sums up this phenomenon by explaining how deceptive content can be perpetuated among masses who are already in agreement: “Being influenced by confirmation bias and selective exposure, they join virtual polarized communities wherein they reinforce their pre-existing beliefs.” (2016)

Echo chambers can change the way content is received and how credible it is perceived to be, simply through the sheer number of people who agree with the partisan positions it may be perpetuating.

The formation of echo chambers in such great volume is dangerous to the fabric of society. Modern society is full of diversity, and while this is a source of strength, the polarisation of many groups can lead to the fragmentation of society.

This fragmentation occurs not only because of the algorithmic content delivery systems put in place, but because of the self-selection that many users engage in to refine the content placed in front of them. Törnberg proposes that “The possibility of self-segregation can therefore affect not only what the segregated users see, but also what perspectives non-segregated users are exposed to. This can occur as subtle and complex network dynamics of the interaction structure of social media can play into the diffusion dynamics, in ways that are not necessarily even understood by the developers of the media platforms.” (2018)

The fragmentation of society into different groups could have drastic effects on the stability of democracy as discussed by Dubois & Blank “There are two concerns about segmentation when it comes to political information and news. The first is a divide between those who are informed and those who are not informed about politics. The second is political polarization among those who exhibit at least minimal political interest or awareness. Since democratic political systems require people talk to each other to work out compromises and/or to become informed, the emergence of an echo chamber could have serious negative consequences.” (2018).

O’Hara & Stevens also discuss some of the consequences that could come about if echo chambers are allowed to develop in the way that they currently are. “First, there will be social fragmentation, as diverse groups polarise. Second, people will use available technology to create information goods tailored to themselves, rather than creating goods that are valuable for many people. Third, satisfaction of preferences will be taken as definitive of people’s well-being, ruling out alternative conceptions that take into account the possibility of extra value being provided by heterogeneous influences” (2015).

The solution to this problem is not as easy as changing the inner workings of a social media platform to show users a balance of content, as this would result in platforms losing users to competitors willing to show only agreeable content. Munson & Resnick highlight this when discussing the issue: “From the perspective of website operators trying to attract and retain users, this is unlikely to be a desirable trade-off. It is unlikely to be sufficient challenge to satisfy diversity-seeking individuals, and would leave them vulnerable to losing challenge-averse individuals to competitors who offer 100% agreeable items all the time (and hence need no highlighting).” (2010)

These are a few of the issues that highlight why the rise of so many different online echo chambers is damaging to the overall fabric of society. As people segregate themselves, they are cut off from hearing arguments that might alter their perception of political issues.

Echo chambers have become an important issue because they are creating a rift between individuals in society. Individuals who previously had to discuss their opinions with the people around them in the real world, people who might not share those opinions, can now recede into an online echo chamber where their values and beliefs are never challenged, and where they can potentially be pushed or lured into extremist views.

When people become so convinced of and wrapped up in their ideology through an echo chamber, they can become extreme advocates. Moreover, the online functions of blocking or muting people are not available in a heated real-world debate, which can lead to increased tensions between opposing groups.

A fair suggestion for addressing this problem involves showing users their shared history or common interests with another person. Grevet, Terveen, & Gilbert “suggest opportunities to make weak ties more resilient. Calling attention to past interactions and shared interests could make common ground visible during arguments. These strategies, such as knowing when to step away, point to constant negotiations evolving around disagreements.” (2014).

The online and offline behaviours of these groups often affect each other in surprising ways. For example, Bright describes how “parties which are more politically successful offline are also typically more disconnected online, and it is significant because it shows that online fragmentation is not purely a result of decisions made by individuals online; the offline context has an impact.” (2016).

Despite efforts to dampen the impact of political polarisation, it seems that simply exposing people to opposing viewpoints is not the optimal solution, as it only drives people to become more entrenched in their established views. This was shown in a field experiment “that offered a large group of Democrats and Republicans financial compensation to follow bots that retweeted messages by elected officials and opinion leaders with opposing political views. Republican participants expressed substantially more conservative views after following a liberal Twitter bot, whereas Democrats’ attitudes became slightly more liberal after following a conservative Twitter bot.” (Bail et al., 2018)

A major example of the real-world effect of echo chambers is found in fake news. Because many people do not have the time to comb through every piece of information, those with established trust and reputation have an opportunity to alter stories to fit a particular narrative. Bakir & McStay suggest that the only way forward is to focus on the economic system underlying this process. “While a laudable variety of solutions to the deeply socially and democratically problematic contemporary fake news phenomenon have been proposed, each faces specific obstacles to achieving widespread implementation and impact. While we recognise the need for all these solutions to take root, our recommendation, to focus on digital advertising, addresses the contemporary phenomenon at its economic heart.” (2018)

These are a few of the ways that online echo chambers can have an effect offline, most evidently in the political and news segments of society. Although this phenomenon could be seen before the rise of social media, social media has certainly increased the speed and intensity at which people are becoming segregated from their local communities.

In this conference paper I have endeavoured to discuss how echo chambers have begun to form more rapidly and why the issue is so important to address. Echo chambers have been exacerbated by social media, to an ultimately negative effect. Social media platforms have indeed created a vast wealth of content to learn from and be entertained by, but underneath, their systems are changing society in ways that could prove catastrophic, some still unknown.

References

Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. B. F., … Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216–9221. https://doi.org/10.1073/pnas.1804840115

Bakir, V., & McStay, A. (2018). Fake News and The Economy of Emotions: Problems, causes, solutions. Digital Journalism, 6(2), 154–175. https://doi.org/10.1080/21670811.2017.1345645

Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber? Psychological Science, 26(10), 1531–1542. https://doi.org/10.1177/0956797615594620

Bessi, A. (2016). Personality traits and echo chambers on facebook. Computers in Human Behavior, 65, 319–324. https://doi.org/10.1016/j.chb.2016.08.016

Bright, J. (2016). Explaining the Emergence of Echo Chambers on Social Media: The Role of Ideology and Extremism. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2839728

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: the moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter Bubbles, Echo Chambers, and Online News Consumption. Public Opinion Quarterly, 80(S1), 298–320. https://doi.org/10.1093/poq/nfw006

Garimella, K., De Francisci Morales, G., Gionis, A., & Mathioudakis, M. (2018). Political Discourse on Social Media: Echo Chambers, Gatekeepers, and the Price of Bipartisanship. Proceedings of the 2018 World Wide Web Conference on World Wide Web  – WWW ’18, 913–922. https://doi.org/10.1145/3178876.3186139

Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 14(2), 265–285. https://doi.org/10.1111/j.1083-6101.2009.01440.x

Gillani, N., Yuan, A., Saveski, M., Vosoughi, S., & Roy, D. (2018). Me, My Echo Chamber, and I: Introspection on Social Media Polarization. Proceedings of the 2018 World Wide Web Conference on World Wide Web  – WWW ’18, 823–831. https://doi.org/10.1145/3178876.3186130

Grevet, C., Terveen, L. G., & Gilbert, E. (2014). Managing political differences in social media. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing – CSCW ’14, 1400–1408. https://doi.org/10.1145/2531602.2531676

Karlsen, R., Steen-Johnsen, K., Wollebæk, D., & Enjolras, B. (2017). Echo chamber and trench warfare dynamics in online debates. European Journal of Communication, 32(3), 257–273. https://doi.org/10.1177/0267323117695734

Khosravinik, M. (2017). Right Wing Populism in the West: Social Media Discourse and Echo Chambers. Insight Turkey, 19(3), 53–68. https://doi.org/10.25253/99.2017193.04

Munson, S. A., & Resnick, P. (2010). Presenting diverse political opinions: how and how much. Proceedings of the 28th International Conference on Human Factors in Computing Systems – CHI ’10, 1457. https://doi.org/10.1145/1753326.1753543

O’Hara, K., & Stevens, D. (2015). Echo Chambers and Online Radicalism: Assessing the Internet’s Complicity in Violent Extremism: The Internet’s Complicity in Violent Extremism. Policy & Internet, 7(4), 401–422. https://doi.org/10.1002/poi3.88

Ridings, C. M., & Gefen, D. (2006). Virtual Community Attraction: Why People Hang Out Online. Journal of Computer-Mediated Communication, 10(1), 00–00. https://doi.org/10.1111/j.1083-6101.2004.tb00229.x

Törnberg, P. (2018). Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLOS ONE, 13(9), e0203958. https://doi.org/10.1371/journal.pone.0203958

3 thoughts on “The Societal Effects of Online Echo Chambers”

  1. Hi there!

    I really liked your choice of topic! It’s something that I’m personally quite interested in, and I think it’s important for users online to think about how what we’re seeing online isn’t necessarily an accurate reflection of the different opinions and beliefs in the real world, as our feeds are so heavily tailored to suit our likes.

    I think it was great that you touched on a suggestion for a solution for perhaps eliminating these echo chambers, but would love to have heard more. Social networking sites, content sharing platforms and news sites are rife with algorithms and while they play a huge part in determining what shows up on our screens, I personally can’t see that changing any time soon. What do you think are some other ways we can combat echo chambers? Is it the responsibility of those creating the software to diversify our feeds or us as users to try and engage with content that is outside of our usual likes and beliefs?

    It would’ve been great to have a bit more of a discussion around the quotes you used to unpack them a bit and challenge what the authors are saying or build a stronger argument.

    In text referencing was not done in APA format. When you quote the author, the year needs to be in brackets directly after the author’s name, not at the end of the paragraph. I completely understand the confusion though, as the in text citation would otherwise be at the end of the paragraph with the authors name and year if it were paraphrased information.

    Hope you enjoy the rest of the conference!

    Alison

  2. Hi JPetch,
    This was a truly interesting and thought-provoking paper! I had not considered the extent to which algorithms could impact society! Prior to reading your paper, I had only thought about the use of algorithms on Netflix and how it creates a bubble of entertainment making it harder for viewers to find content outside of their typical preferences. I had thought this to be quite restricting, but in comparison to the issues you address here, it seems like a miniscule problem. I was especially drawn to your discussion of selective exposure, cognitive dissonance, the fact that when users are exposed to differing opinions, they merely use it to reinforce their own, and the development of extremism.
    Based on my own experiences, during the 2016 election, I was seeing a lot of anti-Trump views being shared on my social media platforms. I had noticed that this was due to Facebook pages I had interacted with and Twitter users I was following. As I was aware that I was being exposed to quite extreme views and thus, to an extent, the development of a sort of echo chamber, it raises the question of whether most users are aware of this happening and try to challenge it, or simply accept it and do not question it? What are your thoughts on this?

    Looking forward to your reply,
    Thanks,
    Devyn 🙂
    If you’re interested, check out my paper within the stream Communities and Web 2.0 here https://networkconference.netstudies.org/2019Curtin/2019/05/05/active-now-how-web-2-0-allows-for-the-formation-of-online-communities-capable-of-initiating-change-through-activism/

  3. Hello there JPetch,

    Your title intrigued me on first sight as I had no idea as to what was Echo Chambers. It proved to be an instructive reading and is a very well-structured paper. The algorithms that define our preferences and what we see on online platforms would not be changing any time soon.

    Your paper made me think deeply on the matter and I have some questions:

    1. If we only see things in our particular fields of interest, don’t you think we are missing out on other information? Do these algorithms keep us in information bubbles?

    2. You also stated that certain things that appear in users’ feeds on social networking sites may not reflect reality. Does that make some users’ perception of reality more difficult?

    Regards,
    Keshav

    You can have a look on my paper on the following link : https://networkconference.netstudies.org/2019Curtin/2019/05/09/social-media-influencers-defining-construction-of-identit/
