Introduction:
Since its commonly agreed inception with the launch of MySpace in 2003, social media has expanded its capabilities and accessibility significantly, and with that expansion has come exponential growth in active users; Facebook alone grew from 100 million users in 2008 to over 3 billion in 2023 (Statista, 2024). Created by Mark Zuckerberg, Facebook was originally a networking application strictly for Harvard University students (Hansell, 2006). By 2024, Facebook had evolved into a multinational technology conglomerate, and what was once a small online community for Harvard students now offers globally accessible videos, marketplaces and near-unlimited information. Facebook is not alone: almost anywhere you turn on the internet today, there is access to information from, and communication with, people all over the world. The growing number of social media platforms has created a demand for users’ attention, and companies are using intrusive algorithms to compete for it, a dynamic known as the ‘attention economy’ (Nelson-Field, 2020). A study by Garrett (2009) found evidence, from early in the social media era, that people preferred to consume information that conformed to their personal beliefs. As algorithms push content that they predict will keep people scrolling, users are constantly fed ideas that do not challenge their individual viewpoints, and over time a ‘filter bubble’ (Geschke et al., 2018) forms in which people are never forced to consider alternative opinions on social and political issues. The attention economy has created an environment where critical thinking has become polarised, and filter bubbles as well as echo chambers are poisoning minds for tech-company profit.
Discussion:
The rapid expansion of social media has created a demand for user attention, a dynamic that can be described as the attention economy. Davenport and Beck (2001) explored the concept over two decades ago, describing an “organisational ADD” (p. 1) in which the “assault of information” (p. 1) makes the scarcity of attention palpable. With the emergence of short-form content, and especially the rise of TikTok, human attention spans have decreased over the past 25 years, as shown in Mark’s (2023) research. The sheer volume and prevalence of social media applications, combined with shortening attention spans, has created an economy for human attention. Companies compete to keep users on their applications for as long as possible, boosting usage statistics and, in turn, advertising revenue, the biggest moneymaker for social media companies (McFarlane, 2022). Complex algorithms that take into account a user’s online activity, perceived personality and usage patterns (Orlowski, 2020) are used to determine which content will maximise the attention received from that user. Given the intrusiveness of these algorithms and past events such as the 2021 Facebook data leak, it is very difficult to believe that these companies have any motive for holding user attention other than profit, a sentiment best captured in Orlowski’s (2020) film: “If you’re not paying for the product, you are the product.”
Because of the sophistication of these algorithms, users often find themselves in ‘filter bubbles’: a state of isolation from information and opinions the user has not expressed interest in. Filter bubbles prevent inclusive political discussion because they remove the channels in which opposing viewpoints might clash (Bozdag & van den Hoven, 2015). By limiting online discourse between opposing viewpoints, users may become polarised in their approach to the issues they have been ‘bubbled’ away from, struggling even to see reason from other perspectives, which creates a social divide. As this happens on a broader scale, democracy can suffer from the decreased quality and diversity of information available to certain people and groups (Bozdag & van den Hoven, 2015), since democracy generally depends on a diversity of decision making and beliefs. Individuals are often completely oblivious to the fact that they are in a filter bubble; it can be described as ‘invisible’. People whose feeds are curated by algorithms receive news and information from only a narrow viewpoint and are often never exposed to other trains of thought, creating an unrealistic perception of reality. Filter bubbles can also affect the way we live in the real world. As Pariser (2011) explains, “To be the author of your life, you have to be aware of a diverse array of options and lifestyles” (p. 18); for a chronically online person, algorithms may not paint the entire picture. Creativity is often generated by the coming-together of concepts from different disciplines and cultures, and inside a filter bubble there is less opportunity for chance encounters to create insight (Pariser, 2011). Perhaps the quote from Pariser that best demonstrates the division created by filter bubbles is: “In an age when shared information is the bedrock of shared experience, the filter bubble is a centrifugal force, pulling us apart” (2011, p. 14).
The evolution of online networks in the 21st century, coinciding with an era of social change, has created fertile ground for echo chambers. In 2004, Plant described online communities as “driven by the human desire for connection, knowledge and information” (para. 6), but 20 years later online networks are far more expansive and far-reaching. There is an abundance of different types of online networks, covering topics from gaming and professional networking to art, to name only a few. Applications such as Reddit and Discord have made it free and easy for anybody to access content and connect with people around just about any topic imaginable. The growth in the range and accessibility of online communities has coincided with many progressive social changes across the world. The LGBTQ movement, which saw same-sex marriage legalised in the United States in 2015 and in Australia in 2017, and the MeToo movement, started in 2006 and popularised in 2017, are examples of causes that have greatly benefited from safe online networks making activism easier than ever. However, with the increased prevalence of online networks has come the creation of echo chambers, described by Geschke et al. (2018) as “a social phenomenon where filter bubbles of interacting individuals strongly overlap” (para. 2). What results is an ‘echo effect’ in which participants in a network only encounter information that reinforces their personal beliefs, while differing opinions are blocked out. Echo chambers have a similar effect on individuals to filter bubbles, creating an environment where a group’s beliefs are not challenged and opposing groups can become socially divided.
Echo chambers have had a particularly strong effect on U.S. politics over the last ten years. The rise of Donald Trump saw an increase in online far-right communities, and in reaction to Trump’s governance, far-left communities have become prevalent as well. Elon Musk’s purchase of Twitter, now known as X, in 2022 brought a dramatic reduction in content moderation on one of the biggest discussion platforms on the internet, enabling extremist groups on both sides to thrive. The issue is not that these groups exist online, but that users are being trapped in echo chambers where extremist viewpoints are never challenged by alternative beliefs, polarising the thinking of those who get trapped, especially on prominent issues such as abortion and vaccines. The increase in polarising viewpoints, particularly on Twitter, as shown in a study by Garimella and Weber (2017), has had a flow-on effect on U.S. politicians. A study by the UCLA Social Sciences Division (2020) showed that polarisation in policy between Republicans and Democrats, driven mostly by Republicans, is at a 200-year high. A poll by Rainie et al. (2019) found that 64% of Americans believe trust in each other has shrunk, and 75% believe trust in the federal government has shrunk. Another epidemic created by echo chambers is misinformation; a study by Cinelli et al. (2021) showed that “when polarisation is high, misinformation quickly proliferates” (para. 24). Repeated exposure to misinformation, or ‘fake news’, further polarises an individual’s thought process, sinking them deeper into an echo chamber.
The increase in polarised thinking among the general public has paved the way for the seemingly unstoppable rise of populism in the Western world: the election of Donald Trump in 2016, Brexit in the United Kingdom and One Nation in Australia. In Western Europe, populist parties now attract around 25% of votes in elections, compared with 5-7% in the 1990s (Flew & Iosifidis, 2020). The effectiveness of multinational stalwarts such as NATO and the European Union has been called into question. The rise of populism has gone hand in hand with the rise of social media as a means of political discussion (Geschke et al., 2018), and it is no coincidence that Donald Trump and the Brexit campaign made heavy use of social media for their messaging, with both campaigns succeeding. The question to ask is not necessarily whether populism is good or bad, but whether populism has dumbed down social and political discussion. For many, taking the ‘popular’ viewpoint can be the easy way out, rather than critically evaluating the options using reliable sources of information. During the COVID-19 period of 2021-2022 there was ample debate about the necessity and effectiveness of the vaccines; however, rarely did either side seem to use sound arguments, and hollow gaslighting and emotional manipulation appeared to be the popular tactics of celebrities and the general populace alike. The emergence of filter bubbles and echo chambers has made it easier for people to justify weakly supported viewpoints because of the constant positive affirmation of their beliefs. There does not seem to be an obvious solution to the issues filter bubbles and echo chambers create. Pariser (2011) suggested sabotaging personalisation systems by erasing web history, deleting cookies, using lesser-known search engines and attempting to fool personalisation algorithms; however, these are tedious measures that would be nearly impossible to implement on a broad scale. It is also unlikely that federal governments will interfere with the operations of private companies. The only way to break free entirely from the shackles of filter bubbles and echo chambers would be to stop using social media, but this is not an option for many, and the genuine positives of social media would be missed. A more reasonable solution is for individuals to monitor their own use of social media, remaining mindful of the profit-driven intentions of big-tech companies and not being naive about the prowess of their algorithms.
Conclusion:
Social media has connected people all over the world through many mediums of communication. Over time, social media companies have innovated, constantly expanding their features to attract and retain the attention of users. The abundance of online networks has enabled like-minded people to share ideas and passions; however, intrusive algorithms have created toxic and divisive online environments that have flowed into real-world social and political issues. Filter bubbles and echo chambers, created by algorithms and tech companies’ relentless drive for usership and profit, have contributed to a rise in populism in recent years and have caused critical social and political thinking to deteriorate. Tech companies can be expected to keep refining their processes to be as efficient as possible, and for now there does not appear to be a straightforward way to escape these artificially curated environments; however, continued education about the manipulative tactics of big tech companies, along with unifying political leadership, may help return more moderate schools of thought to the mainstream.
References
Bozdag, E., & van den Hoven, J. (2015, December 15). Breaking the filter bubble: democracy and design. Ethics Inf Technol 17, pp. 249-265.
Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9).
Davenport, T. H., & Beck, J. C. (2001, April). The Attention Economy. A New Perspective on Business: Welcome to the Attention Economy, p. Chapter 1.
UCLA Social Sciences Division. (2020). Parties overview. Retrieved from Voteview: https://voteview.com/parties/all
Flew, T., & Iosifidis, P. (2020). Populism, globalisation and social media. International Commuincation Gazette, 7-25.
Garimella, V. R., & Weber, I. (2017). A Long-Term Analysis of Polarization on Twitter. https://arxiv.org/pdf/1703.02769.pdf.
Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 265-285.
Geschke, D., Lorenz, J., & Holtz, P. (2018). The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology, 129-149.
Hansell, S. (2006, September 12). Site Previously for Students Will Be Opened to Others. The New York Times.
Mark, G. (2023). Attention Span. HarperCollins.
McFarlane, G. (2022, December 2). How Facebook (Meta), X Corp (Twitter), Social Media Make Money From You. Retrieved from Investopedia: https://www.investopedia.com/stock-analysis/032114/how-facebook-twitter-social-media-make-money-you-twtr-lnkd-fb-goog.aspx
Nelson-Field, K. (2020). The Attention Economy and How Media Works: Simple Truths for Marketers. Singapore: Springer.
Statista. (2024). Number of monthly active Facebook users worldwide as of 4th quarter 2023. Retrieved from https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/
Orlowski, J. (Director). (2020). The Social Dilemma [Motion Picture].
Pariser, E. (2011). The Filter Bubble: What The Internet Is Hiding From You. London: Penguin Books.
Plant, R. (2004). Online communities. Technology in Society, 51-65.
Rainie, L., Keeter, S., & Perrin, A. (2019, July 22). Trust and Distrust in America. Retrieved from Pew Research Center: https://www.pewresearch.org/politics/2019/07/22/trust-and-distrust-in-america/