Introduction
Advances in digital technology have revolutionized how people communicate. The widespread availability of the Internet has changed how information is accessed, giving people effortless access to a wide range of material from anywhere. Balaji et al. (2021) report that textual communication has become the primary mode of interaction, with over 18.2 million messages transmitted every minute. However, this rapid development has also produced the phenomenon of information overload: as people navigate the vast amount of online content, the challenge of finding relevant information becomes increasingly overwhelming.
To address information overload, social media platforms have turned to personalized algorithmic solutions to enhance the user experience. These algorithms analyze user data and behaviour to deliver tailored content through self-selected or pre-selected personalization methods (Borgesius et al., 2016). However, personalized algorithms constrict users’ exposure to diverse content by reinforcing their existing beliefs and perspectives (Berman and Katona, 2020). This pattern of algorithmic curation contributes to the emergence and growth of echo chambers (Cinelli et al., 2021). Furthermore, the complexity of algorithmic influence extends beyond mere personalization: its capacity to manipulate user behaviour and exacerbate addiction has raised concerns (Smith and Short, 2022). Platforms such as TikTok are prominent examples of algorithm-driven engagement, featuring advanced recommendation systems customised to individual preferences (Bucher, 2020).
This paper explores information overload, algorithmic personalization, echo chambers, and filter bubbles, and their impact on user behaviour and cognition in digital environments. It aims to highlight the complexity of algorithmic influence in the digital era and to emphasize the importance of critical evaluation in digital media consumption.
Addressing information overload in the digital era
The widespread adoption of the Internet and the rapid development of digital technology have created a dynamic media environment in which individuals are free to access and gather vast amounts of information (Groes, 2017). Bozdag (2013) observes that digital media infiltrates both private and public spheres: users’ activities such as communicating, shopping, and sharing generate extensive data traces. Information overload in this increasingly digital society therefore poses serious challenges. Groes (2017) notes that information overload has produced technostress conditions in the digital age, such as data addiction and infostress. As the volume of information grows, the boundaries of its interpretation become increasingly blurred, making information overload itself difficult to grasp (Groes, 2017). The sheer volume of information can overwhelm individuals, diverting their attention, hindering filtering, and reducing their receptivity to new concepts (Rodriguez et al., 2014). It also poses threats to personal well-being and democratic processes (Bozdag, 2013). To address this challenge, social media platforms have implemented curation algorithms. Amid today’s vast flow of information, recommender algorithms play a vital role in limiting overload by helping users discover relevant content (Swart, 2021). Platforms also present personalized content through algorithmic technology to attract users and improve the user experience (Berman and Katona, 2020). Berman and Katona (2020) note that, unlike traditional media, social platforms aim to be the ultimate source of customized and relevant content for users.
What are algorithms and filter bubbles?
Pariser (2011) describes such algorithms as mechanisms for acquiring user preferences and customizing content accordingly. Over time, these algorithms shape users’ media environments to mirror their interests precisely, representing a form of information determinism (Pariser, 2011).
Furthermore, Pariser (2011) proposes that filter bubbles are personalized information ecosystems created by algorithms that customize content based on past user behaviour. Specifically, the filter bubble is shaped by pre-selected personalization (Pariser, 2011). These bubbles emerge as users engage with content and are served more of the same. For example, when a user clicks a link to signal interest, a reinforcement loop is created that exposes the user to increasingly similar content (Pariser, 2011).
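To make this reinforcement loop concrete, the following minimal Python sketch simulates it. Everything here is invented for illustration: the topic labels, the scoring rule, and the update weight are hypothetical and are not drawn from Pariser (2011) or any real recommender system.

```python
import random

# Hypothetical content topics; labels and scores are invented.
TOPICS = ["politics", "sports", "science", "music", "travel"]

def recommend(scores, k=3):
    """Rank topics by the user's accumulated interest scores."""
    return sorted(TOPICS, key=lambda t: scores[t], reverse=True)[:k]

# Start with a user who has no strong preferences.
scores = {topic: 1.0 for topic in TOPICS}

for _ in range(20):
    shown = recommend(scores)
    # The user clicks one of the items shown; the click raises that
    # topic's score, making it even more likely to be shown next time.
    clicked = random.choice(shown)
    scores[clicked] += 1.0

# After a few iterations the same topics dominate the feed:
print(recommend(scores))
```

Even with random clicks, early choices accumulate weight, so the feed converges on a narrow set of topics; this path dependence is the core of the reinforcement loop Pariser describes.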
Algorithmic personalization in the digital era
The digital era has driven advances in algorithms and data analytics, creating increasingly personalized online services. In this personalized environment, users can selectively connect with like-minded friends, and personalized algorithms encourage them to seek out people with shared interests and viewpoints. Furthermore, algorithms can filter high-quality, desirable content for users (Berman and Katona, 2020), thereby influencing their consumption habits. Borgesius et al. (2016) highlight two primary methods of personalizing user content and experience: self-selected and pre-selected personalization. These methods represent different approaches to implementing personalized algorithms.
Self-selected personalization refers to the process whereby individuals actively choose experiences that align with their interests and personal views (Borgesius et al., 2016). Users customize their experience by selecting settings or features to enhance their satisfaction and engagement with the platform or service. They tend to avoid information that contradicts their viewpoints and prefer information that matches their preferences and interests. For example, users can influence a platform algorithm’s decisions through implicit actions, such as adjusting their browsing behaviour (Swart, 2021). Borgesius et al. (2016) note that this behaviour is conceptualized as selective exposure in communication science. This approach enhances users’ control, allowing them to shape the content they consume by explicitly expressing their preferences.
In addition, pre-selected personalization refers to options or settings that are pre-determined by websites, advertisers, or systems based on user preferences or characteristics (Borgesius et al., 2016). This form of customization does not involve the user’s explicit choice or awareness (Borgesius et al., 2016), requiring no active input or selection from users. For example, a website may pre-select product recommendations based on a user’s past purchase records. In contrast to self-selected personalization, users have limited control over the content they receive, reducing the diversity of what they encounter. While some users consciously opt into this customization, others are unaware of it, as with Facebook’s personalized news feed (Borgesius et al., 2016).
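The contrast between the two methods can be sketched in code. In the hypothetical Python example below, the catalogue, the user fields, and both functions are invented for illustration and do not describe any real platform’s implementation:

```python
from dataclasses import dataclass, field

# Invented product catalogue mapping items to categories.
CATALOG = {
    "laptop stand": "office", "desk lamp": "office",
    "running shoes": "sport", "yoga mat": "sport", "novel": "books",
}

@dataclass
class User:
    explicit_interests: set = field(default_factory=set)   # self-selected
    purchase_history: list = field(default_factory=list)   # mined passively

def self_selected(user: User) -> list:
    """The user explicitly declared which categories to show."""
    return [item for item, cat in CATALOG.items()
            if cat in user.explicit_interests]

def pre_selected(user: User) -> list:
    """The system infers categories from past purchases; no input asked."""
    inferred = {CATALOG[p] for p in user.purchase_history if p in CATALOG}
    return [item for item, cat in CATALOG.items() if cat in inferred]

u = User(explicit_interests={"books"}, purchase_history=["yoga mat"])
print(self_selected(u))  # driven by the user's own settings: ['novel']
print(pre_selected(u))   # driven by behaviour the user may not recall
```

The design difference matters: in the first function the user holds the control surface, while in the second the platform does, which is exactly the loss of control Borgesius et al. (2016) associate with pre-selection.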
Echo chambers
In the digital era, online platforms such as social media services use algorithms to provide personalized content and enhance the user experience. However, this seemingly beneficial customization leads to the creation of echo chambers. Cinelli et al. (2021) propose that algorithms on social media platforms restrict users’ exposure to diverse viewpoints, promoting the development of echo chambers characterized by the sharing and reinforcement of familiar narratives within homogeneous groups. These echo chambers create an environment in which users repeatedly receive information consistent with their existing beliefs, reinforcing their views over time. The echo chamber effect is sustained by the interplay of individual preferences and platform algorithms.
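A toy opinion model can illustrate how this reinforcement plays out. The Python sketch below is a simplified bounded-confidence simulation with invented parameters; it is not the methodology of Cinelli et al. (2021), only an illustration of the clustering dynamic they describe:

```python
import random

# Opinions on a -1..1 scale; population size and tolerance are invented.
opinions = [random.uniform(-1, 1) for _ in range(30)]
TOLERANCE = 0.3  # users only engage with views this close to their own

for _ in range(2000):
    i, j = random.sample(range(len(opinions)), 2)
    # Homophily: interaction happens only between similar views,
    # and each interaction pulls the two opinions closer together.
    if abs(opinions[i] - opinions[j]) < TOLERANCE:
        midpoint = (opinions[i] + opinions[j]) / 2
        opinions[i] += 0.5 * (midpoint - opinions[i])
        opinions[j] += 0.5 * (midpoint - opinions[j])

# The population typically settles into a few tight, internally
# homogeneous clusters: a minimal picture of echo chambers.
print(sorted(round(o, 2) for o in opinions))
```

Because dissimilar opinions never interact in this model, each cluster only ever hears itself, mirroring the sharing and reinforcement of familiar narratives within homogeneous groups described above.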
The challenge of algorithmic personalization for information accessibility
With the support of digital technology, online platforms use personalized algorithmic techniques to address information overload (Swart, 2021). However, algorithmic personalization presents both opportunities and challenges. It reinforces users’ existing biases and reduces their exposure to diverse ideas (Berman and Katona, 2020). Furthermore, Berman and Katona (2020) argue that content polarization under both self-selected and pre-selected personalization reinforces concerns that users receive less challenging information.
Dahlgren (2021) suggests that user preferences and platform algorithms are the primary factors influencing exposure to challenging information. On the side of user preferences, confirmation bias leads users to seek content that aligns with their worldview and to avoid challenging information (Dahlgren, 2021). Individuals tend to select information that reinforces their existing views and to avoid conflict in social interactions (Dahlgren, 2021). This selective search for supportive information makes social media content more homogeneous, reducing the presence of challenging or dissenting viewpoints.
On the side of platform algorithms, Bozdag (2013) explains that personalization algorithms regulate incoming and outgoing information, filtering content for users and determining its visibility to others. Personalization algorithms tend to prioritize content that aligns with user preferences, thus reinforcing users’ existing beliefs and limiting exposure to diverse or challenging information (Dahlgren, 2021). The impact of algorithms on exposure to challenging information varies with the specific platform and personalization method, potentially contributing to the formation of filter bubbles and echo chambers.
According to Berman and Katona (2020), algorithms inadvertently generate filter bubbles as they prioritize content based on users’ past interactions (e.g., browsing and likes). Filter bubbles confine users to information that aligns with their beliefs, thereby exacerbating societal polarization. Moreover, personalized algorithms narrow filter bubbles into ever more personalized streams of information, making challenging information harder to reach (Dahlgren, 2021). These processes often reinforce each other, encouraging users to avoid content that challenges their viewpoints.
Exploring the complexities of social media algorithms and their impact on user behaviour
Social media algorithms play a crucial role in shaping user behaviour by influencing the content users encounter. Dahlgren (2021) cautions, however, that deterministic accounts of this influence oversimplify the relationship between technology and user behaviour, conflating technological determinism with strong behaviourism and overlooking the distinction between preferences and choices. Algorithms are viewed as damaging because they attempt to manipulate users by steering them towards specific content, thus intensifying their dependence on the platform (Ionescu and Licu, 2023). Because social media content frequently involves users’ social circles, algorithmic recommendations also affect users’ perceptions and opinions of the people within those circles. Furthermore, the mutual interaction of filter bubbles and algorithms creates a feedback loop (Dahlgren, 2021). This cycle gradually narrows the content users receive, influencing their future preferences and viewpoints (Dahlgren, 2021). Owing to the opacity of algorithms, users can only infer their effects from what they experience (Ionescu and Licu, 2023). Hence, algorithms deeply affect users’ behaviour and cognition on social media platforms, sometimes with negative outcomes.
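The narrowing effect of such a feedback loop can even be quantified in a small simulation. The sketch below is again purely illustrative, with invented topics and weights; it tracks the Shannon entropy of a simulated feed’s exposure distribution, which falls as engagement feeds back into future exposure:

```python
import math
import random

TOPICS = ["news", "comedy", "fitness", "cooking", "gaming"]

def entropy(weights: dict) -> float:
    """Shannon entropy of the exposure distribution (higher = more diverse)."""
    total = sum(weights.values())
    return -sum((w / total) * math.log2(w / total)
                for w in weights.values() if w > 0)

weights = {topic: 1.0 for topic in TOPICS}
for step in range(1, 101):
    # The feed samples a topic in proportion to its current weight.
    shown = random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
    weights[shown] += 0.5  # engagement feeds back into future exposure
    if step % 25 == 0:
        print(f"step {step:3d}: entropy = {entropy(weights):.2f} bits")
```

Starting from the maximum of log2(5) ≈ 2.32 bits, the entropy drifts downward as weight concentrates on a few topics, giving a concrete reading of how the cycle narrows the content users receive.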
TikTok’s algorithmic impact on user engagement and experience
Social media platforms, particularly TikTok, have become integral to people’s lives and command high levels of engagement. TikTok is known as the most addictive platform, with over 1 billion monthly users (Ionescu and Licu, 2023). Ionescu and Licu (2023) propose that individual motivations and behaviours significantly shape the TikTok experience. When combined with algorithm-driven personalized content aligned with users’ values, this interaction can significantly enhance positive experiences (Ostic et al., 2021; Naslund et al., 2020). Social media algorithms adeptly turn data into customized content that appeals to user preferences, such as TikTok’s “For You” page (Bucher, 2020). TikTok’s algorithmic system is highly advanced and complex, particularly regarding user engagement, content delivery, and interaction variety (Smith and Short, 2022). However, this complex system exacerbates addiction issues, especially among younger users (Smith and Short, 2022), reinforcing TikTok’s reputation as the most addictive platform (Ionescu and Licu, 2023).
Conclusion
This article has examined the phenomenon of information overload brought about by the expansion of digital technologies and the widespread adoption of the Internet. Under an overwhelming volume of information, people’s receptivity to new information declines significantly. Online platforms such as social media services address this challenge by implementing personalized algorithms, which aim to enhance user experience and satisfaction. However, personalized algorithms inadvertently contribute to the formation of echo chambers and filter bubbles, limiting exposure to different viewpoints. In addition, algorithms wield tremendous power to shape user behaviour and cognition, thereby exacerbating addiction issues and the cognitive biases that accompany them. The paper used TikTok as a case study to illustrate the complex workings of such platforms, highlighting the interaction between algorithms and users. Overall, this article has discussed the multifaceted challenges posed by information overload and algorithmic influence in the digital age.
References
Balaji, T., Annavarapu, C. S. R., & Bablani, A. (2021). Machine learning algorithms for social media analysis: A survey. Computer Science Review, 40, 100395.
Berman, R., & Katona, Z. (2020). Curation algorithms and filter bubbles in social networks. Marketing Science, 39(2), 296-316.
Borgesius, F. J. Z., Trilling, D., Möller, J., Bodó, B., De Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review.
Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15, 209-227.
Bucher, T. (2020). Nothing to disconnect from? Being singular plural in an age of machine learning. Media, Culture & Society, 42(4), 610-617.
Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118.
Dahlgren, P. M. (2021). A critical review of filter bubbles and a comparison with selective exposure. Nordicom Review, 42(1), 15-33.
Groes, S. (2017). Information overload in literature. Textual Practice, 31(7), 1481-1508.
Ionescu, C. G., & Licu, M. (2023). Are TikTok Algorithms Influencing Users’ Self-Perceived Identities and Personal Values? A Mini Review. Social Sciences, 12(8), 465.
Naslund, J. A., Bondre, A., Torous, J., & Aschbrenner, K. A. (2020). Social media and mental health: Benefits, risks, and opportunities for research and practice. Journal of Technology in Behavioral Science, 5, 245-257.
Ostic, D., Qalati, S. A., Barbosa, B., Shah, S. M. M., Galvan Vela, E., Herzallah, A. M., & Liu, F. (2021). Effects of social media use on psychological well-being: a mediated model. Frontiers in Psychology, 12, 678766.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.
Rodriguez, M. G., Gummadi, K., & Schoelkopf, B. (2014). Quantifying information overload in social media and its impact on social contagions. Proceedings of the International AAAI Conference on Web and Social Media, 8, 170-179.
Smith, T., & Short, A. (2022). Needs affordance as a key factor in likelihood of problematic social media use: Validation, latent profile analysis and comparison of TikTok and Facebook problematic use measures. Addictive Behaviors, 129, 107259.
Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2), 20563051211008828.