How Algorithms and Corporate Greed are Creating Unhealthy Online Environments

Introduction:

Since its commonly agreed inception with the launch of MySpace in 2003, social media has expanded significantly in capability and accessibility, and with that expansion has come exponential growth in active users: Facebook alone grew from 100 million users in 2008 to over 3 billion in 2023 (Statista, 2024). Created by Mark Zuckerberg, Facebook was originally a networking application strictly for Harvard University students (Hansell, 2006). By 2024, Facebook had evolved into a multinational technology conglomerate, and what was once a small online community for Harvard students now offers globally accessible videos, marketplaces and unlimited information. Facebook is not alone: anywhere you turn on the internet nowadays, there is access to information from, and communication with, people all over the world. The growing number of social media platforms has created a demand for users’ attention, and companies are using intrusive algorithms to compete for it, a concept known as the ‘attention economy’ (Nelson-Field, 2020). A study by Garrett (2009) found evidence from early in the social media era that people preferred to consume information that conformed to their personal beliefs. As algorithms push content they believe will keep users scrolling, users are constantly fed ideas that do not challenge their individual viewpoints, and over time a ‘filter bubble’ (Geschke et al., 2018) is created in which people are not forced to consider alternative opinions on social and political issues. The attention economy has created an environment in which critical thinking has become polarised, and filter bubbles as well as echo chambers are poisoning minds for tech-company profit.

 

Discussion:

The rapid expansion of social media has created a demand for user attention, a concept that can be described as the attention economy. Davenport and Beck (2001) explored the concept as early as 2001 in relation to “organisational ADD” (p. 1) and how the “assault of information” (p. 1) makes the scarcity of attention palpable. With the emergence of short-form content, especially the rise of TikTok, human attention spans have decreased over the past 25 years, as shown by Mark’s (2023) research. The increased prevalence and sheer volume of social media applications, combined with shortening human attention spans, has created an economy for human attention. Companies compete to keep users on their applications for as long as possible, boosting usage statistics and in turn advertising revenue, the biggest moneymaker for social media companies (McFarlane, 2022). Complex algorithms that take into account a user’s online activity, perceived personality and usage patterns (Orlowski, 2020) determine which content to show in order to maximise the attention received from that user. Given the intrusiveness of these algorithms and past events such as the 2021 Facebook data leak, it is very difficult to believe that these companies have any motive for maintaining user attention other than profit, best put in Orlowski’s 2020 film: “If you’re not paying for the product, you are the product.”
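
To make the mechanism concrete, the sketch below shows how an engagement-driven feed ranker might work in principle. It is a minimal illustration, not any platform’s actual system: the `Post` fields, the weights and the scoring rule are all invented for the example. Each candidate post is scored by predicted engagement and the feed serves the top scorers; notably, nothing in the objective rewards diversity of viewpoint.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str
    similarity_to_history: float  # 0.0-1.0: match with the user's past activity (assumed signal)
    base_engagement: float        # 0.0-1.0: how engaging the post is overall (assumed signal)

def predicted_engagement(post: Post) -> float:
    """Toy scoring rule: content resembling what the user already
    engages with is predicted to hold their attention longer."""
    return 0.7 * post.similarity_to_history + 0.3 * post.base_engagement

def rank_feed(candidates: list[Post], k: int = 3) -> list[Post]:
    """Serve the k posts predicted to maximise attention. Nothing here
    rewards diversity of viewpoint, which is how a feed can drift
    toward a filter bubble."""
    return sorted(candidates, key=predicted_engagement, reverse=True)[:k]

feed = rank_feed([
    Post("a", "politics-left", 0.9, 0.6),
    Post("b", "politics-right", 0.1, 0.8),
    Post("c", "cooking", 0.4, 0.5),
    Post("d", "politics-left", 0.8, 0.7),
])
print([(p.post_id, p.topic) for p in feed])  # the dissenting post "b" never surfaces
```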

 

Because of the sophistication of these algorithms, users often find themselves in ‘filter bubbles’: isolation from information and opinions the user hasn’t expressed interest in. Filter bubbles prevent inclusive political discussion because they remove channels in which opposing viewpoints may clash (Bozdag & van den Hoven, 2015). By limiting online discourse between opposing viewpoints, users may become polarised in their approach to issues they have been ‘bubbled’ from, struggling even to see reason from other perspectives, which creates a social divide. As this happens on a broader scale, democracy can suffer from a decreased quality and diversity of the information available to certain people and groups (Bozdag & van den Hoven, 2015), as democracy generally requires a diversity of decision making and beliefs. Individuals are often completely oblivious to the fact that they are in a filter bubble; it can be described as ‘invisible’. People whose feeds are curated by algorithms receive news and information from only a certain viewpoint and are often not even exposed to other trains of thought, creating an unrealistic perception of reality. Filter bubbles can affect the way we live in the real world. As Pariser (2011) explains, “To be the author of your life, you have to be aware of a diverse array of options and lifestyles” (p. 18); for a chronically online person, algorithms may not paint the entire picture. Creativity is often generated by the coming-together of concepts from different disciplines and cultures, and inside a filter bubble there is less opportunity for chance encounters to create insight (Pariser, 2011). Perhaps the best Pariser quote demonstrating the division created by filter bubbles is: “In an age when shared information is the bedrock of shared experience, the filter bubble is a centrifugal force, pulling us apart” (2011, p. 14).
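
The self-reinforcing loop that makes the bubble ‘invisible’ can be illustrated with a toy simulation, extending the ranker sketch above; all the numbers here are assumptions made for the sake of the example. The feed mostly shows the topic the user has engaged with most, each impression strengthens that preference, and exposure steadily concentrates on a single topic.

```python
import random

def run_feedback_loop(rounds: int = 20, seed: int = 0) -> None:
    """Toy filter-bubble loop: the 'feed' favours the topic with the
    highest interest weight, and engaging with what is shown further
    strengthens that interest, narrowing exposure over time."""
    random.seed(seed)
    topics = ["left", "right", "sport", "cooking"]
    # Start with only a mild preference for one topic.
    interest = {"left": 0.3, "right": 0.25, "sport": 0.25, "cooking": 0.2}

    for _ in range(rounds):
        # Show the highest-weighted topic, with a small chance of exploration.
        if random.random() < 0.1:
            shown = random.choice(topics)
        else:
            shown = max(interest, key=interest.get)
        # Engagement with what was shown reinforces that interest.
        interest[shown] += 0.05
        total = sum(interest.values())
        interest = {t: w / total for t, w in interest.items()}

    # After a few rounds the initial mild lean dominates the feed.
    print({t: round(w, 2) for t, w in interest.items()})

run_feedback_loop()
```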

 

The evolution of online networks in the 21st century, coinciding with an era of social change, has created an online vacuum of echo chambers. In 2004, Plant described online communities as “driven by the human desire for connection, knowledge and information” (para. 6), but 20 years later online networks are far more expansive and far-reaching. There is an abundance of different types of online networks, covering topics such as gaming, professional networking and art, to name a few of many. Applications such as Reddit and Discord have made it free and easy for anybody to access content and connect with people on just about any topic imaginable. The growth in the range of, and accessibility to, online communities has coincided with progressive social change across the world: the LGBTQ movement saw same-sex marriage legalised in the United States in 2015 and in Australia in 2017, and the MeToo movement, started in 2006 and popularised in 2017, is another example of a cause that has greatly benefited from safe online networks making activism easier than ever. However, with the increased prevalence of online networks has come the creation of echo chambers, described by Geschke et al. (2018) as “a social phenomenon where filter bubbles of interacting individuals strongly overlap” (para. 2). What results is an ‘echo effect’ in which participants in a network only encounter information that reinforces personal beliefs, blocking differing opinions. Echo chambers have a similar effect on individuals as filter bubbles, creating an environment in which a group’s beliefs are not challenged and opposing groups can become socially divided. Echo chambers have especially affected U.S. political policy over the last 10 years. The rise of Donald Trump saw an increase in online far-right communities, and in counter to Trump’s governance, far-left communities have become prevalent as well. Elon Musk’s purchase of Twitter, now known as X, in 2022 saw a dramatic reduction in content moderation on one of the biggest discussion platforms on the internet, enabling extremist groups on both sides to thrive. The issue isn’t that these groups exist online, but that users are being trapped in echo chambers where extremist viewpoints are not challenged by alternative beliefs, polarising the thinking of those who get trapped, especially in regard to prevalent issues such as abortion and vaccines. The increase in polarising viewpoints, particularly on Twitter, as shown in a study by Garimella and Weber (2017), has had a flow-on effect on U.S. politicians. A study by the UCLA Social Sciences Division (2020) showed that polarisation in policy between Republicans and Democrats, mostly driven by Republicans, is at a 200-year high. A poll by Rainie et al. (2019) found that 64% of Americans believe trust in each other has shrunk, and 75% believe trust in the Federal Government has shrunk. Another epidemic created by echo chambers is misinformation, with a study by Cinelli et al. (2021) showing that “when polarisation is high, misinformation quickly proliferates” (para. 24). Repeated exposure to misinformation or ‘fake news’ further polarises an individual’s thought process, sinking them deeper into an echo chamber.
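
Geschke et al. reached their conclusions through agent-based modelling. The sketch below is a minimal model in that spirit: a standard Deffuant-style bounded-confidence simulation with assumed parameters, not their actual triple-filter model. Agents update their opinions only toward peers who already agree with them within a threshold, and the population settles into separate, non-interacting clusters, an echo-chamber-like outcome.

```python
import random

def bounded_confidence(n_agents: int = 50, steps: int = 2000,
                       threshold: float = 0.2, seed: int = 1) -> list[float]:
    """Deffuant-style bounded-confidence model: two random agents move
    their opinions toward each other only if they already agree within
    `threshold`. Low thresholds yield several isolated opinion clusters."""
    random.seed(seed)
    opinions = [random.random() for _ in range(n_agents)]  # opinions on a 0-1 spectrum
    for _ in range(steps):
        i, j = random.sample(range(n_agents), 2)
        if abs(opinions[i] - opinions[j]) < threshold:
            # Agreeing agents meet in the middle; disagreeing agents never interact.
            shift = 0.5 * (opinions[j] - opinions[i])
            opinions[i] += shift
            opinions[j] -= shift
    return sorted(opinions)

print([round(o, 2) for o in bounded_confidence()])  # opinions settle into a few clusters
```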

 

The increase in polarised thinking among the general public has given way to the seemingly unstoppable rise of populism in the Western world: the election of Donald Trump in 2016, Brexit in the United Kingdom, and One Nation in Australia. In Western Europe, populist parties now attract 25% of votes in elections, compared with 5-7% in the 1990s (Flew & Iosifidis, 2020). The effectiveness of multinational stalwarts such as NATO and the European Union has been called into question. The rise of populism has coincided hand-in-hand with the rise of social media as a means of political discussion (Geschke et al., 2018), and it is no coincidence that Donald Trump and the Brexit campaign used social media heavily for their messaging, both campaigns being successful. The question to ask is not necessarily whether populism is good or bad, but whether populism has dumbed down social and political discussion. To many, taking the ‘popular’ viewpoint can be perceived as the ‘easy way out’, avoiding having to critically evaluate the options using reliable sources of information. During the 2021-2022 era of COVID, there was ample debate about the necessity and effectiveness of the vaccines; however, rarely did it seem that sound arguments were used on either side. Instead, hollow gaslighting and emotional manipulation seemed to be popular tactics of celebrities and the general populace alike. The emergence of filter bubbles and echo chambers has made it easier for people to justify weakly supported viewpoints through constant positive affirmation of their beliefs. There does not seem to be an obvious solution to the issues filter bubbles and echo chambers create. Pariser (2011) suggested sabotaging personalisation systems by erasing web history, deleting cookies, using lesser-known search engines and attempting to fool personalisation algorithms; however, these are tedious solutions that would be nearly impossible to implement on a broad scale. It is unlikely that federal governments will interfere with the operations of private companies. The only way to break free from the shackles of filter bubbles and echo chambers would be to not use social media at all, but this is not an option for many, and there are inherent positives to social media that would be missed. A more reasonable solution would be for individuals to monitor their own use of social media, being mindful of the harmful intentions of big-tech companies and not being naive to the prowess of algorithms.

 

Conclusion:

Social media has connected people all over the world through many mediums of communication. Over time, social media companies have innovated, constantly expanding features to attract and retain the attention of users. The abundance of online networks has enabled like-minded people to share ideas and passions; however, intrusive algorithms have created toxic and divisive online environments that have flowed into real-world social and political issues. Filter bubbles and echo chambers, created by algorithms and tech companies’ relentless drive for usership and profit, have contributed to a rise in populism in recent years, causing critical social and political thinking to deteriorate. Tech companies can be expected to continue refining their processes to be as efficient as possible, and for now there does not appear to be a straightforward solution for escaping the effects of these artificial environments; however, continued education about the malicious tactics of big tech companies, along with unifying political leadership, may help return more moderate schools of thought to the mainstream.

 

References

Bozdag, E., & van den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology, 17, 249-265.

Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9).

Davenport, T. H., & Beck, J. C. (2001). The attention economy: Understanding the new currency of business. Harvard Business School Press.

UCLA Social Sciences Division. (2020). Parties overview. Retrieved from Voteview: https://voteview.com/parties/all

Flew, T., & Iosifidis, P. (2020). Populism, globalisation and social media. International Communication Gazette, 7-25.

Garimella, V. R., & Weber, I. (2017). A long-term analysis of polarization on Twitter. Retrieved from https://arxiv.org/pdf/1703.02769.pdf

Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 14(2), 265-285.

Geschke, D., Lorenz, J., & Holtz, P. (2018). The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology, 129-149.

Hansell, S. (2006, September 12). Site Previously for Students Will Be Opened to Others. The New York Times.

Mark, G. (2023). Attention span. HarperCollins.

McFarlane, G. (2022, December 2). How Facebook (Meta), X Corp (Twitter), Social Media Make Money From You. Retrieved from Investopedia: https://www.investopedia.com/stock-analysis/032114/how-facebook-twitter-social-media-make-money-you-twtr-lnkd-fb-goog.aspx

Nelson-Field, K. (2020). The Attention Economy and How Media Works: Simple Truths for Marketers. Singapore: Springer.

Statista. (2024). Number of monthly active Facebook users worldwide as of 4th quarter 2023. Retrieved from https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/

Orlowski, J. (Director). (2020). The Social Dilemma [Motion Picture].

Pariser, E. (2011). The Filter Bubble: What The Internet Is Hiding From You. London: Penguin Books.

Plant, R. (2004). Online communities. Technology in Society, 51-65.

Rainie, L., Keeter, S., & Perrin, A. (2019, July 22). Trust and distrust in America. Retrieved from Pew Research Center: https://www.pewresearch.org/politics/2019/07/22/trust-and-distrust-in-america/



Comments

14 responses to “How Algorithms and Corporate Greed are Creating Unhealthy Online Environments”

  1. dale_b

    Hi Zac,

    You have done a wonderful job researching and writing your paper, and it reminded me of the Cambridge Analytica scandal, where Facebook user data was successfully used for Trump’s political gain.

    I agree with what you said … “populism has dumbed down social and political discussion”. The dumbing down of discourse undermines critical thinking, and it’s crucial for users to critically evaluate information and look for diverse perspectives. Social media platforms should be more responsible, addressing the spread of misinformation and creating environments conducive to informed and constructive debate.

    Should policymakers and social media platforms continue to allow the dissemination of misinformation, the negative effects on society will continue to grow.

    Can you think of additional negative effects not mentioned in your paper that may occur?

    Kind Regards,

    Dale B.

    1. Zac Reed

      Thanks for the reply Dale,

      Ironically, since I wrote my paper there has been much discussion from prominent political figures and social media in regard to censorship and the Government’s role in the control of information; Anthony Albanese and Elon Musk notably got into a war of words over the Government’s request to remove graphic video on X relating to the Sydney stabbings.

      A perfect example of a negative effect can be derived from that Sydney incident: following the attack, there was rapid and widespread misinformation regarding the ethnicity of the attacker. Assumptions made by people online spiralled into racial profiling, abuse and a broader debate about immigration policy, all in a matter of hours after the attack. Algorithms and content moderators were slow to react, something that appears to happen too often when it comes to targeted racial and sexual abuse online.

      I believe the onus of preventing this from happening mostly falls on social media platforms. Algorithms designed to retain attention push harmful content towards users. There should be a level of responsibility to protect users from such content; however, I’m not sure how this responsibility can be enforced.

  2. niracaro

    Great article Zac! It’s fascinating to see how social media has evolved over the years, from its humble beginnings with platforms like MySpace to the massive conglomerates we have today like Facebook. The exponential growth in active users and the expansion of capabilities have truly transformed how we connect and consume information online. The concept of the attention economy is quite intriguing, highlighting how companies are competing for users’ attention through algorithms that tailor content to individual preferences. This personalised content can create filter bubbles, limiting exposure to diverse viewpoints. It’s essential for users to be aware of these dynamics and actively seek out a variety of perspectives to avoid being trapped in echo chambers. Each point was well addressed and supported; in particular I liked your paragraph on the ‘echo effect’. Very insightful!

    I’ve also created a blog on social media and its changes; hope you can have a look too!
    https://networkconference.netstudies.org/2024/csm/3079/the-imppact-of-tiktok-algorithm-on-the-fashion-industry/

    1. Zac Reed

      Thanks for the reply,

      Absolutely, it is important for people to seek out diverse viewpoints; however, the unfortunate nature of personalised content makes this increasingly difficult. I think increased education at the primary and early secondary levels about how algorithms operate and the poor intentions of tech companies could help counteract the effects of echo chambers and filter bubbles.

  3. El Ashcroft

    Interesting read; I’m having flashbacks to my first social media use, which was MySpace. I remember thinking, why do we need another platform, when Facebook first became a thing for the mass public, but now I don’t even question it when another one pops up, because it is probably more about the attention economy than it is about consumer needs.

    I enjoyed your discussion about filter bubbles and echo chambers. With the way social media platforms use these, and considering the decrease in human attention spans, do you think there is a way people can escape them, and will they want to?

    1. Zac Reed

      Thanks for the comment El,

      I think people can absolutely escape online filter bubbles and echo chambers; I did it myself with a conscious and sustained effort. The problem, as you mentioned, is that there is no motivation or reason for a lot of people to. These platforms are designed to be addictive, and the downsides of social media overuse aren’t as easily identifiable as those of other addictive habits such as nicotine or gambling.

      Given that short-form content like TikTok and Reels is still relatively new, it will probably take time to see the negative effects of overuse. I suspect attention spans will continue to shorten, and we will probably see interpersonal skills in young people worsen over time.

  4. Douglas Baker

    Thank you for your paper Zac – can we really expect any different behaviour by social media companies when they are themselves under pressure from shareholders to provide profits every year? Yes, their algorithms may make breaking out of a filter bubble difficult, but at what point does it become the responsibility of the user of the platform to seek information that may challenge their existing viewpoints?

    Regards
    Douglas Baker

    1. Zac Reed

      Thanks Douglas,

      I definitely think there needs to be more primary and secondary education around the dangers of algorithms, filter bubbles etc. It’s important for young people to at least be aware of some of the traps and dangers of social media overuse.

      1. Douglas Baker

        Wondering what your view is on services such as Ground News (https://ground.news/landingV2/welcome) for at least attempting to break out of the filter bubbles and echo chambers?

        1. Zac Reed

          Sites like that are great, but I think the problem with filter bubbles and echo chambers is that people who are actively trapped inside of them have very little motivation to seek out alternate sources. Doesn’t help that Ground News is behind a paywall as well.

          If sites like Ground News and All Sides can market themselves into the forefront of news media, or somehow reach people who maybe only get their news from polarised sources like CNN or Fox News, I think that would be great for broad-scale social and political discussion.

  5. Faisal Al Zubaidi

    Hi Zac,

    Thank you for sharing your paper. It fits quite well with mine! Yours is about the algorithms and mine is on their negative effects. I think regulations should be put in place to limit the unhinged influence of algorithms. If you were a member of parliament, what new laws might you apply to reduce the negative impact of social media algorithms?

    1. Zac Reed

      I’m not sure that Government regulation is the best way to negate the negative effects of intrusive algorithms. I think a better solution is increased education around the nature and purpose of algorithms, particularly at the late primary/early secondary level. I suppose if I was a member of Parliament I would lobby the School Curriculum and Standards Authority to implement this.

      1. Faisal Al Zubaidi

        Hi Zac

        That could be a possible solution. I’m not sure whether it would be implemented though due to the increasing study load and the nature of free will. It could be utilised to plant a seed within individuals though, so that when they do notice hardships, they can respond appropriately.

  6. 20668255

    Hi Zac,

    I really enjoyed your comprehensive analysis of how algorithms and corporate greed are creating unhealthy online environments! In particular, I liked how your paper connects filter bubbles and echo chambers to real-world impacts such as misinformation and decreased trust. It’s interesting to see a different perspective on social media’s impacts that isn’t as positive. It shows that social media is an impactful tool; however, that impact may also cause harm in an unhealthy way.

    How do you think users can effectively mitigate the effects of the challenges you addressed?
