Addressing Information Overload and Algorithmic Influence in the Digital Era

Introduction

In the contemporary digital era, advances in digital technology have revolutionized how people communicate. The widespread availability of the Internet has changed the way information is accessed, giving people effortless access to a wide range of information from anywhere. Balaji et al. (2021) propose that textual communication has emerged as the primary mode of interaction, with over 18.2 million messages transmitted every minute. However, this rapid development has also produced the phenomenon of information overload: as people navigate the vast amount of online content, the challenge of sifting relevant information from the noise becomes increasingly overwhelming.

To address the challenge of information overload, social media platforms have turned to personalized algorithmic solutions designed to enhance user experiences. These algorithms analyze user data and behaviour to tailor content for each user through self-selected or pre-selected personalization methods (Borgesius et al., 2016). However, personalized algorithms constrict users’ exposure to diverse content by strengthening their existing beliefs and perspectives (Berman and Katona, 2020). This form of algorithmic curation contributes to the emergence and growth of echo chambers (Cinelli et al., 2021). Furthermore, the complexity of algorithmic influence extends beyond mere personalization; its ability to manipulate user behaviour and exacerbate addiction issues has raised concerns (Smith and Short, 2022). Among these challenges, social media platforms like TikTok are prominent examples of algorithm-driven engagement, featuring advanced systems customised to individual preferences (Bucher, 2020).

This paper explores information overload, algorithmic personalization, echo chambers, and filter bubbles, and their impact on user behaviour and cognition in digital environments. Its aim is to highlight the complexity of algorithmic influence in the digital era and to emphasize the importance of critical evaluation in digital media consumption.


Addressing information overload in the digital era

The widespread adoption of the Internet and the rapid development of digital technology have created a dynamic contemporary era. Individuals have the freedom to access and gather vast amounts of information in this open media environment (Groes, 2017). Bozdag (2013) observes that digital media infiltrates both private and public spheres; users’ activities such as communication, shopping, and sharing generate extensive data traces. Information overload in this increasingly digital society therefore poses serious challenges. Groes (2017) notes that information overload has produced symptoms of technostress in the digital age, such as data addiction and infostress. As the volume of information grows, the boundaries of its interpretation become increasingly blurred, making it difficult for people to make sense of what they encounter (Groes, 2017). For instance, the sheer volume of information can overwhelm individuals, leading to attention diversion, difficulty in filtering, and reduced receptivity to new concepts (Rodriguez et al., 2014). It also poses threats to personal well-being and democratic procedures (Bozdag, 2013). To address this challenge, social media platforms have implemented curation algorithms. In today’s vast flow of information, recommender algorithms play a vital role in limiting information overload by helping users discover relevant content (Swart, 2021). Moreover, social media platforms present personalized content through algorithmic technology to attract users and improve the user experience (Berman and Katona, 2020). Berman and Katona (2020) note that, unlike traditional media, social platforms aim to be the ultimate source of customized and relevant content for their users.
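
As a concrete illustration of this kind of curation, the Python sketch below ranks an overwhelming stream of posts against a user’s interest profile and surfaces only the top few. It is a minimal toy, not the method of any platform cited above; the topic labels, weights, and scoring rule are all invented for the example.

```python
# A toy relevance-ranking curator: score each post by how well its
# topics match the user's interest profile, then keep only the top few.
# All data and weights are invented; real platforms use far richer
# signals and models.

def relevance(post_topics, interest_profile):
    """Score a post by summing the user's weight for each of its topics."""
    return sum(interest_profile.get(topic, 0.0) for topic in post_topics)

def curate(feed, interest_profile, top_n=3):
    """Return only the top-N most relevant posts, discarding the rest."""
    ranked = sorted(feed, key=lambda p: relevance(p["topics"], interest_profile),
                    reverse=True)
    return ranked[:top_n]

interests = {"technology": 0.9, "music": 0.6, "politics": 0.1}
feed = [
    {"id": 1, "topics": ["technology", "ai"]},
    {"id": 2, "topics": ["politics"]},
    {"id": 3, "topics": ["music", "technology"]},
    {"id": 4, "topics": ["sports"]},
]

# Out of the full stream, the user sees only a small, relevant slice.
print([post["id"] for post in curate(feed, interests)])  # [3, 1, 2]
```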


What are algorithms and filter bubbles?

Pariser (2011) describes such algorithms as mechanisms for acquiring user preferences and customizing content accordingly. Over time, these algorithms shape users’ media environments to perfectly reflect their interests, representing a form of information determinism (Pariser, 2011).

Furthermore, Pariser (2011) proposes that filter bubbles are personalized information ecosystems created by algorithms that customize content based on past user behaviour. Specifically, the filter bubble is shaped by pre-selected personalization (Pariser, 2011). These bubbles emerge as users engage with content and are served more of the same based on past behaviour. For example, when a user clicks on a link to indicate interest, this creates a reinforcement loop that exposes the user to similar content (Pariser, 2011).
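
Pariser’s reinforcement loop can be made visible in a few lines of Python: each click strengthens the interest profile that a ranking step (like the curation sketch above) relies on, so similar content becomes ever more likely to surface. The update rule and boost value are hypothetical, chosen only to expose the loop.

```python
# A minimal sketch of the filter-bubble reinforcement loop: every click
# raises the weight of the clicked content's topics, biasing future
# ranking toward more of the same. The boost value is invented.

def record_click(interest_profile, post_topics, boost=0.2):
    """Strengthen the profile for every topic of a clicked post."""
    for topic in post_topics:
        interest_profile[topic] = interest_profile.get(topic, 0.0) + boost
    return interest_profile

profile = {"cooking": 0.5, "travel": 0.5}
# The user clicks three cooking posts in a row...
for _ in range(3):
    record_click(profile, ["cooking"])
# ...and the profile now heavily favours cooking over travel, so a
# relevance ranker will surface ever more cooking content.
print({t: round(w, 1) for t, w in profile.items()})  # {'cooking': 1.1, 'travel': 0.5}
```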


Personalization algorithms in the digital era

The digital era has driven advances in algorithms and data analytics, creating increasingly personalized online services. In this personalized environment, users can selectively connect with like-minded friends: personalized algorithms encourage users to seek out people with common interests and viewpoints. Furthermore, algorithms can filter high-quality, desirable content for users (Berman and Katona, 2020), thereby influencing their consumption habits. Borgesius et al. (2016) highlight two primary methods of personalizing user content and experience: self-selected and pre-selected personalization. These methods represent different approaches to implementing personalized algorithms.

Self-selected personalization refers to the process whereby individuals actively choose experiences that align with their interests and personal views (Borgesius et al., 2016). Users customize their experience by selecting settings or features to enhance their satisfaction and engagement with the platform or service. They tend to avoid information that contradicts their viewpoints and prefer information that aligns with their personal preferences and interests. For example, users can influence a platform algorithm’s decisions through implicit actions, such as adjusting their browsing behaviour (Swart, 2021). Borgesius et al. (2016) note that this behaviour is conceptualized as selective exposure in communication science. This approach enhances users’ control, allowing them to shape the content they consume by explicitly expressing their preferences, as sketched below.
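
Here is a deliberately simplified sketch of self-selected personalization: the user explicitly declares topics to follow or mute, and the filter honours those declarations. The field names and data are invented; the point is that the user’s own choices, not the system’s inferences, decide what appears.

```python
# Self-selected personalization as explicit user settings: keep posts on
# followed topics, drop posts on muted ones. Field names are hypothetical.

def self_selected_filter(feed, followed, muted):
    """Apply the user's explicitly chosen follow/mute lists to a feed."""
    result = []
    for post in feed:
        topics = set(post["topics"])
        if topics & muted:
            continue  # the user explicitly opted out of these topics
        if topics & followed:
            result.append(post)  # the user explicitly opted in
    return result

feed = [
    {"id": 1, "topics": ["climate"]},
    {"id": 2, "topics": ["celebrity gossip"]},
    {"id": 3, "topics": ["climate", "politics"]},
]
visible = self_selected_filter(feed, followed={"climate"}, muted={"celebrity gossip"})
print([p["id"] for p in visible])  # [1, 3]: the user's declared choices decide
```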

Pre-selected personalization, by contrast, refers to options or settings chosen in advance by websites, advertisers, or systems on the basis of user preferences or characteristics (Borgesius et al., 2016). This form of customization does not involve the user’s explicit choice or individual awareness (Borgesius et al., 2016), requiring no active input or selection from users. For example, a website may pre-select product recommendations for a user based on their past purchase records. In contrast to self-selected personalization, users have limited control over the content they receive, which narrows the diversity of their experience. While some users consciously opt into this customization, others are unaware of it, as with Facebook’s personalized newsfeed (Borgesius et al., 2016).
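
Pre-selected personalization can be contrasted with the previous sketch in code: here the system infers a user’s favourite categories from their purchase history and recommends accordingly, with no explicit input from the user at all. The data and selection rule below are purely illustrative.

```python
# Pre-selected personalization as silent inference: the system counts
# which categories the user has bought from most and recommends only
# from those, without asking. All data here is invented.

from collections import Counter

def pre_selected_recommendations(purchase_history, catalogue, top_n=2):
    """Recommend catalogue items from the user's most-purchased categories."""
    counts = Counter(item["category"] for item in purchase_history)
    favourites = {cat for cat, _ in counts.most_common(top_n)}
    return [item for item in catalogue if item["category"] in favourites]

history = [
    {"item": "running shoes", "category": "sport"},
    {"item": "yoga mat", "category": "sport"},
    {"item": "novel", "category": "books"},
]
catalogue = [
    {"item": "tennis racket", "category": "sport"},
    {"item": "cookbook", "category": "books"},
    {"item": "headphones", "category": "electronics"},
]
# The user never asked for this; the system decided on their behalf,
# and the "electronics" category is silently filtered out.
print(pre_selected_recommendations(history, catalogue))
```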


Echo chambers

In the digital era, online platforms such as social media platforms utilize algorithms to provide personalized content and enhance the user experience. However, this seemingly beneficial customization leads to the creation of echo chambers. Cinelli et al. (2021) propose that algorithms on social media platforms restrict users’ exposure to diverse viewpoints, promoting the development of echo chambers characterized by the sharing and reinforcement of familiar narratives within homogeneous groups. These echo chambers create an environment where users repeatedly receive information consistent with their existing beliefs, reinforcing their views over time. The echo chamber effect persists because the user’s individual preferences and the platform’s algorithms sustain each other.
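
A toy simulation can make this dynamic visible. If each user only ever “hears” opinions within a fixed tolerance of their own (a crude stand-in for algorithmic curation plus self-selection), the population splits into internally converging camps that never exchange views. The update rule, tolerance, and starting opinions are invented for illustration and are not drawn from Cinelli et al. (2021).

```python
# A bounded-confidence toy model of echo chambers: each user moves a
# little toward the average opinion of like-minded users only, so the
# two camps converge internally and never hear each other. All values
# are invented for illustration.

def step(opinions, tolerance=0.3, pull=0.1):
    """Move each opinion toward the mean of opinions within `tolerance`."""
    updated = []
    for me in opinions:
        peers = [o for o in opinions if abs(o - me) <= tolerance]
        avg = sum(peers) / len(peers)
        updated.append(me + pull * (avg - me))
    return updated

opinions = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]  # two loose camps on a 0-1 scale
for _ in range(50):
    opinions = step(opinions)

# Each camp has collapsed onto its own consensus; the gap remains.
print([round(o, 2) for o in opinions])  # [0.15, 0.15, 0.15, 0.85, 0.85, 0.85]
```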


The challenge of algorithmic personalization for information accessibility

With the support of digital technology, online platforms use personalized algorithmic techniques to address information overload (Swart, 2021). However, algorithmic personalization presents both opportunities and challenges. It reinforces users’ existing biases, reducing exposure to diverse ideas (Berman and Katona, 2020). Furthermore, Berman and Katona (2020) propose that content polarization under both self-selected and pre-selected personalization reinforces concerns about a reduction in the challenging information users receive.

Dahlgren (2021) suggests that user preferences and platform algorithms are the primary factors influencing exposure to challenging information. On the side of user preferences, confirmation bias leads users to seek content that aligns with their worldview and to avoid exposure to challenging information (Dahlgren, 2021). Individuals tend to select information that reinforces their existing views and to avoid conflict in social interactions (Dahlgren, 2021). This selective search for supportive information makes social media content more homogeneous, reducing the presence of challenging or dissenting viewpoints.

On the side of platform algorithms, Bozdag (2013) explains that personalization algorithms regulate incoming and outgoing information, filtering content for users and determining its visibility to others. Personalization algorithms tend to prioritize content that aligns with user preferences, thus reinforcing users’ existing beliefs and limiting exposure to diverse or challenging information (Dahlgren, 2021). The impact of algorithms on exposure to challenging information varies with the specific platform and personalization method, potentially contributing to the formation of filter bubbles and echo chambers.

According to Berman and Katona (2020), algorithms inadvertently generate filter bubbles as they prioritize content based on users’ past interaction behaviours (e.g., browsing, likes). The creation of filter bubbles restricts users to information that aligns with their beliefs, thereby exacerbating the polarization of society. Moreover, personalized algorithms turn the information environment into a tailored one, making challenging information harder to access (Dahlgren, 2021). These processes often reinforce each other, allowing users to avoid content that actively challenges their viewpoints.


Exploring the complexities of social media algorithms and their impact on user behaviour

Social media algorithms play a crucial role in shaping user behaviour by influencing the content users encounter online. Dahlgren (2021) argues, however, that common assumptions about this influence oversimplify the relationship between technology and user behaviour, conflating technological determinism with strong behaviourism and overlooking the distinction between preferences and choices. Algorithms are often viewed as damaging because they attempt to steer users towards specific content, thus intensifying their dependence on the platform (Ionescu and Licu, 2023). Because social media content frequently involves users’ social circles, algorithmic recommendations also shape users’ perceptions and opinions of other individuals within those circles. Furthermore, the mutual interaction of filter bubbles and algorithms creates a feedback loop (Dahlgren, 2021). This cycle gradually narrows the content users receive, influencing their future preferences and viewpoints (Dahlgren, 2021). Due to the opacity of algorithms, users can only infer these effects from what they observe (Ionescu and Licu, 2023). Hence, algorithms deeply affect users’ behaviour and cognition on social media platforms, sometimes with negative outcomes.
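
The feedback loop can be sketched as a simple rich-get-richer process: the algorithm samples what to show in proportion to the profile’s weights, engagement reinforces whatever was shown, and over many rounds the feed drifts toward whatever happened to be shown early on, rather than toward anything the user deliberately chose. All numbers below are invented for illustration.

```python
# A minimal sketch of the algorithm-user feedback loop: show topics in
# proportion to profile weights, then let engagement reinforce whatever
# was shown. The feed narrows through path dependence, not user intent.
# All weights are invented.

import random

random.seed(1)  # fixed seed so the toy run is reproducible

topics = ["news", "sport", "music", "gaming"]
profile = {t: 1.0 for t in topics}  # start with no real preference

def shown_topic(profile):
    """Sample a topic with probability proportional to its weight."""
    total = sum(profile.values())
    r, acc = random.uniform(0, total), 0.0
    for topic, weight in profile.items():
        acc += weight
        if r <= acc:
            return topic
    return topic  # float-rounding fallback: the last topic

for _ in range(200):
    seen = shown_topic(profile)
    profile[seen] += 1.0  # engagement feeds back into the profile

# The share of each topic is now skewed by the loop itself.
total = sum(profile.values())
print({t: round(w / total, 2) for t, w in profile.items()})
```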


TikTok’s algorithmic impact on user engagement and experience

Social media platforms, particularly TikTok, have become integral parts of people’s lives, attracting high levels of engagement. TikTok is known as the most addictive platform, with over 1 billion monthly users (Ionescu and Licu, 2023). Ionescu and Licu (2023) propose that individual motivations and behaviours significantly shape a user’s TikTok experience. When combined with algorithm-driven personalized content aligned with users’ values, this interaction can significantly enhance positive experiences (Ostic et al., 2021; Naslund et al., 2020). Social media algorithms adeptly turn behavioural data into customized content that appeals to user preferences, such as TikTok’s “For You Page” (Bucher, 2020). TikTok has a highly advanced and complex algorithmic system, particularly regarding user engagement, content delivery, and interaction variety (Smith and Short, 2022). However, this sophisticated system exacerbates addiction issues, especially among younger populations (Smith and Short, 2022), reinforcing TikTok’s reputation as the most addictive platform (Ionescu and Licu, 2023).


Conclusion

This article has examined the phenomenon of information overload against the background of expanding digital technologies and the widespread adoption of the Internet. Under the overwhelming volume of information, people’s receptivity to new information has significantly decreased. Online platforms such as social media address this challenge by implementing personalized algorithms, which aim to enhance user experience and satisfaction. However, personalized algorithms inadvertently contribute to the formation of echo chambers and filter bubbles, limiting exposure to different viewpoints. In addition, algorithms have tremendous power to shape user behaviour and cognition, exacerbating cognitive biases and addiction issues. This paper used TikTok as a case study to illustrate the complex workings of such a platform, highlighting the interaction between algorithms and users. In sum, this article has discussed the multifaceted challenges posed by information overload and algorithmic influence in the digital age.


References

Balaji, T., Annavarapu, C. S. R., & Bablani, A. (2021). Machine learning algorithms for social media analysis: A survey. Computer Science Review, 40, 100395.

Berman, R., & Katona, Z. (2020). Curation algorithms and filter bubbles in social networks. Marketing Science, 39(2), 296-316.

Borgesius, F. J. Z., Trilling, D., Möller, J., Bodó, B., De Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1).

Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15, 209-227.

Bucher, T. (2020). Nothing to disconnect from? Being singular plural in an age of machine learning. Media, Culture & Society, 42(4), 610-617.

Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118.

Dahlgren, P. M. (2021). A critical review of filter bubbles and a comparison with selective exposure. Nordicom Review, 42(1), 15-33.

Groes, S. (2017). Information overload in literature. Textual Practice, 31(7), 1481-1508.

Ionescu, C. G., & Licu, M. (2023). Are TikTok Algorithms Influencing Users’ Self-Perceived Identities and Personal Values? A Mini Review. Social Sciences, 12(8), 465.

Naslund, J. A., Bondre, A., Torous, J., & Aschbrenner, K. A. (2020). Social media and mental health: benefits, risks, and opportunities for research and practice. Journal of Technology in Behavioral Science, 5, 245-257.

Ostic, D., Qalati, S. A., Barbosa, B., Shah, S. M. M., Galvan Vela, E., Herzallah, A. M., & Liu, F. (2021). Effects of social media use on psychological well-being: a mediated model. Frontiers in Psychology, 12, 678766.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.

Rodriguez, M. G., Gummadi, K., & Schoelkopf, B. (2014). Quantifying information overload in social media and its impact on social contagions. Proceedings of the International AAAI Conference on Web and Social Media, 8, 170-179.

Smith, T., & Short, A. (2022). Needs affordance as a key factor in likelihood of problematic social media use: Validation, latent Profile analysis and comparison of TikTok and Facebook problematic use measures. Addictive Behaviors, 129, 107259.

Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2), 20563051211008828.



Comments

10 responses to “Addressing Information Overload and Algorithmic Influence in the Digital Era”

  1. dylanbradshaw

    Hey there, thought the article was really well written! Especially about the overwhelming amount of confirmation bias a person can receive in a day. This constant fulfilling of a person’s own views and values has to leave a person one-noted, no? Do you think new generations of children will be given a chance to develop their own personalities and interests, or do you think it will be predetermined by a social media platform if they are given access early enough? On a related note, what do you think of so called “iPad kids”? I know this isn’t an article about parenting but I think it’s an important issue.

    Would love to hear your thoughts!

    1. Sanna

      Hi Dylan,

      Thank you for your appreciation of the paper. The issues you raise are very important, especially when it comes to the impact on the younger generation.

      In my paper, I discussed how social platform algorithms inadvertently reinforce individuals’ existing views and values, which can lead to the creation of filter bubbles and echo chambers. For the new generation of children, I believe the influence of digital technology and social media platforms is evident, especially with personalized algorithms providing users with a constant stream of customized content. However, I believe this is not the only factor. Family, educational and social environments also play significant roles in shaping children’s personalities and interests. By actively interacting with children as they grow up, guiding them to recognize different information, explore diverse interests and develop critical thinking skills, adults can help reduce the potential negative impact of algorithm-driven content.

      As for the phenomenon of “iPad kids”, it is indeed a matter of concern. While digital technology can provide children with rich learning and entertainment opportunities, over-reliance on digital devices may have a negative effect on their socialisation and emotional well-being. Therefore, we need to help children strike a healthy balance between the digital and real worlds, providing opportunities for offline activities and social interactions.

      Lastly, I believe that addressing these issues requires collective efforts from families, schools, governments and social media platforms. While social media platforms may have some impact on a child’s development, we must also recognise that they are only part of the overall ecosystem. By creating a supportive and diverse environment, we can help children develop their own personalities and interests in the digital age.

      Once again, thank you for your comment, and I apologise for the delayed response.

      Looking forward to further discussion on these important issues!

  2. Jess Wilson

    Hi Sanna, I thoroughly enjoyed reading your paper as I found your knowledge of social media algorithms very insightful. I learned a lot from your paper, especially about how the TikTok algorithm works and how there are two separate algorithmic personalization systems. Before reading your paper, I knew I was not fond of Facebook’s algorithm, and now I know why: it uses a pre-selected algorithm system, hahaha. I prefer a self-selected algorithm system like Instagram’s, which shows me content based on what I have previously interacted with on the platform. After reading your paper and your examples on TikTok, it sounds a lot like how the TikTok algorithm works too. I think I’m right in saying this? I’m not much of a TikTok user. Once again, great job on your paper, as I learned a lot from your added knowledge on the subject. Jess

    1. Sanna

      Hi Jess,

      Thank you very much for your comment! I’m glad you recognised my paper and learned something new from it.

      Your observation is very accurate: TikTok uses an algorithmic personalisation system to ensure that users have access to the content they are interested in. TikTok’s algorithm is an interesting topic; it utilises user behaviour and preferences to personalise content recommendations, thus providing a tailored user experience. Even if you are not a frequent user of TikTok, you have a good understanding of its algorithm!

      Your preference for Instagram’s self-selected algorithm system is also interesting, and I completely understand your perspective, as it delivers content with personalisation by analysing user interaction behaviour. In contrast, Facebook utilises a pre-selected algorithm system, which can limit the user experience.

      However, I believe we need to be more cautious in examining the personalised functions of social media algorithms. While they aim to provide personalised content, excessive personalisation may lead to the formation of filter bubbles and echo chambers, thereby restricting users’ exposure to diverse viewpoints. Therefore, we need to maintain openness and acceptance of diverse perspectives.

      If you have any further questions or would like to learn more, please feel free to let me know.

      Once again, thank you for your support and feedback, and I apologise for the delayed response.

  3. El Ashcroft

    Interesting read. You’ve discussed some good points about the positives of algorithms creating filter bubbles and echo chambers to help users avoid things like attention diversion, difficulty in filtering, and information anxiety. Your point about users having less direct control over the content they receive is a worry though. Not too bad if you love reading or something, but not great if you have racist tendencies.

    You discuss how algorithms create filter bubbles and echo chambers based on users’ content preferences and actions, and this usually confirms their opinions. However, do you have any thoughts on users that might end up in a filter bubble of content that goes against their opinion because they’ve commented on content they disagree with? For example, as someone who actively disputed misinformation related to the Indigenous Voice to Parliament, I commented on a lot of posts from the no camp; this led to me seeing more posts from the no camp in my feed rather than from the yes camp. Because of the algorithm I found myself in a filter bubble of “no” information, and while I didn’t fall into the trap of believing the information that was circulating, I wonder how many people who were on the fence may have been influenced by what the algorithm fed them.

    If you wouldn’t mind could you take a look at my paper? https://networkconference.netstudies.org/2024/onsc/3578/how-yes-and-no-supporters-used-social-media-to-influence-the-indigenous-voice-to-parliament-vote/

    1. Sanna

      Hi El,

      Thank you for your insightful feedback, I appreciate it.

      You’ve raised some thought-provoking points about the dual nature of algorithms in creating filter bubbles and echo chambers. This is a legitimate concern. While it is true that filter bubbles and echo chambers can help reduce issues like attention diversion and information anxiety, the lack of direct control over the content we receive is indeed worrying, especially when individuals may be exposed to harmful or divisive content.

      Your personal experience highlights the potential trap of individuals being inadvertently restricted to algorithmic filter bubbles, even when we are actively engaging with opposing viewpoints. You’ve raised an important question about how many people might be influenced by the content they’re exposed to through algorithms, particularly when they’re undecided or on the fence about a specific issue. It is an urgent issue and reminds us that algorithms can have a significant impact on shaping our perspectives and potentially influencing opinions, whether consciously or unconsciously. Therefore, we must remain alert and critically evaluate the information we encounter, especially in an era where algorithms play such a crucial role in shaping our digital experiences.

      I’m interested in your paper and would be happy to take a look.

      Thanks again for sharing your insights, and I look forward to reading your paper.

    2. Sanna

      Hi El,

      I wanted to share a paper with you that I find very relevant to our discussion topic. This article thoroughly explores various aspects of social media and its algorithms’ impact on user experience and society. It highlights the harmful effects of filter bubbles and echo chambers on critical thinking and democracy, as well as their role in spreading extreme views and misinformation. Besides that, the article also suggests some response strategies, including individual monitoring of social media use to avoid the negative effects of filter bubbles and echo chambers. I believe you’ll be interested in the insights and analyses presented in it, and perhaps it can help us understand the topic more deeply.

      The example you mentioned about inadvertently getting into filter bubbles by interacting with opposing viewpoints aligns perfectly with the article’s discussion of how algorithms filter content based on user preferences and behaviours, creating “filter bubbles” that pose challenges to broad discussions on social and political issues.

      Here’s the link to the paper:
      HOW ALGORITHMS AND CORPORATE GREED ARE CREATING UNHEALTHY ONLINE ENVIRONMENTS
      https://networkconference.netstudies.org/2024/csm/3607/the-evolution-of-social-media-and-its-impact-on-society/

      I hope you enjoy reading this article and look forward to discussing some of its viewpoints with you.

      Thank you!

  4. Zac Reed

    I really enjoyed your paper; it’s very relevant to how social media seems to consume us nowadays. I really like the term ‘information overload’ that you use. It feels like wherever you look in media nowadays, whether it’s social media, film, etc., you tend to drown in political and pop culture information that you weren’t particularly looking for.

    You mentioned ‘reduced receptivity to novel concepts’ in relation to how algorithms tend to bombard us with information they think we want to see. This is something I also explore in my paper; I think you might enjoy it: https://networkconference.netstudies.org/2024/csm/3607/the-evolution-of-social-media-and-its-impact-on-society/

    1. Sanna

      Hi Zac,

      Thank you very much for your thoughtful feedback!

      I’m glad you found the concept of “information overload” relevant and resonant with your own observations. Information overload is indeed a common issue in the contemporary media landscape: we are constantly bombarded with information that often exceeds our expected interests. Do you find yourself actively managing this deluge of information, or do you have any strategy for dealing with it?

      It sounds like we share similar concerns about the impact of algorithmic filtering on information consumption, particularly given its prevalence. I’m also interested in looking at your paper to explore these themes further.

      Thanks again for your response, and I’ll be sure to check out your paper.

    2. Sanna

      Hi Zac,

      Thank you so much for sharing this article. Your exploration and insights into how social media algorithms shape filter bubbles and echo chambers, and further influence users’ political positions and behaviours, have impressed and benefited me.

      Your discussion of the concept of the “attention economy” and how algorithms push content to create filter bubbles and echo chambers aligns well with my own research. As you point out, companies use intrusive algorithms to compete for users’ online attention by constantly pushing content that attracts them, which further contributes to information overload and the impact of personalized algorithms. This phenomenon limits users’ access to diverse perspectives on social and political issues, polarising opinion and perpetuating the echo chamber.

      I particularly note your discussion about the impact of short-video platforms like TikTok on human attention span. In particular, you mentioned Mark’s research regarding the decrease in attention span over the past two decades. Combined with my personal experience, I also find myself often caught in a cycle of constantly scrolling through videos when using TikTok. The system’s algorithms provide customized video content that is always relevant to my interests, such as pet videos about British Shorthair cats, or the K-pop song I’ve recently become obsessed with, “like that”. This content keeps me scrolling and exposes me to the information I prefer while filtering out the content I’m not interested in. It maximises my attention and sometimes leaves me addicted to the platform, which has had a certain impact on me.

      I’m curious to know more about how we should address the competition for attention on social media platforms, to help users better manage their attention and reduce the impact of filter bubbles and echo chambers.

      Once again, thank you for sharing such an interesting paper!
