Abstract
This paper addresses the question: What effect do filter bubbles created by social media platforms have on adolescents in Australia? As the use of platforms like TikTok, Instagram, and YouTube continues to grow rapidly, it is crucial to ensure that these digital spaces remain safe and supportive environments for young people. This paper explores how filter bubbles—created by social media algorithms that personalise content—are shaping the beliefs and behaviours of Australian adolescents by limiting their exposure to diverse perspectives. It argues that these online environments contribute to the formation of echo chambers, which negatively impact adolescents by reinforcing their existing beliefs and restricting opportunities to develop broader worldviews.
Introduction
In today’s world, social media is everywhere—and it influences us in ways we often don’t even realise. Australians now spend an average of 1 hour and 51 minutes per day on social media, which amounts to more than a third of the 5 hours and 13 minutes of free time we typically have each day (ACS, 2025). This constant engagement is especially common among adolescents, who are the most active age group online. Young Australians aged 14 to 24 are spending hundreds of minutes per week on social media, with young women averaging 822 minutes and young men 528 minutes (ROI, 2022). Every major social media platform—particularly TikTok, Instagram, and YouTube—relies on algorithms to personalise the content users see. These algorithms are designed to track our behaviour, such as what we like, share, or watch, and then recommend more of the same type of content. According to the Digital Marketing Institute, an algorithm is “a mathematical set of rules specifying how a group of data behaves,” and in social media, it’s used to sort, rank, and deliver content tailored to each individual (DMI, 2024). While this may seem helpful, it has a hidden cost. These algorithms create filter bubbles, which are digital spaces where users are only exposed to information and perspectives that match their existing interests and beliefs. As explained by GCF Global, “Being in a filter bubble means these algorithms have isolated you from information and perspectives you haven’t already expressed an interest in” (GCF Global, 2024). For adolescents, this isolation leads to echo chambers—environments where repeated exposure to the same ideas limits their understanding of the world and reduces opportunities to learn from different points of view. This paper argues that filter bubbles created by social media platforms are shaping the beliefs and behaviours of adolescents in Australia by limiting their exposure to diverse perspectives.
As social media continues to dominate young people’s lives, it is essential to understand how these digital echo chambers—especially those found on TikTok, Instagram, and YouTube—are influencing their values, behaviours, and development.
Section 1
Believe it or not, all social media platforms today use algorithms that control what content appears on your feed. Think about the last TikTok you watched or the most recent Instagram post you saw—those posts were specifically selected to reach your page based on your behaviour, whether you realised it or not. Although platforms use different algorithms, they all rely on machine learning and a set of factors known as ranking signals, which help determine the value of each piece of content for each user at a particular moment in time (Hootsuite, 2024). These systems are designed to increase the amount of time users spend on the platform by continually offering content that is most likely to keep them engaged. This personalised content delivery creates a more enjoyable and immersive experience, which encourages people to stay on the platform longer (Sprout Social, 2024). For adolescents, this becomes especially influential. The content they see is carefully tailored to trigger emotional responses—often happiness or excitement—which leads to a dopamine hit that makes them want to keep scrolling. As mentioned earlier, Australian adolescents aged 14 to 24 are some of the heaviest users of social media, with young women averaging 822 minutes per week and young men 528 minutes (ROI, 2022). The more they engage, the more the algorithm feeds them similar content based on what they like or interact with. While this might sound harmless, it actually restricts what adolescents are exposed to. ABC News (2024) explains that these algorithms can narrow the type of content users see, reducing their exposure to different perspectives or ideas. This is how filter bubbles form—when users are continually shown content that aligns with their existing beliefs, while alternative or opposing viewpoints are filtered out. This narrowing of content leads to the creation of echo chambers, which are online spaces where users mostly encounter views that reflect and reinforce their own. 
According to GCF Global (2024), this can result in misinformation and distort a user’s ability to understand or consider other perspectives. For adolescents still forming their identities and worldviews, these echo chambers can be especially harmful, as they limit critical thinking and empathy. In the end, filter bubbles and echo chambers— driven by algorithmic systems—are influencing how adolescents in Australia think, act, and engage with the world. This supports the argument that social media platforms are not just neutral tools but powerful forces shaping adolescent beliefs by restricting their exposure to diverse perspectives.
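The feedback loop described in this section—engagement drives ranking, and ranking drives further engagement with the same kind of content—can be sketched in a few lines of Python. This is a toy illustration only, not the actual ranking system of TikTok, Instagram, or YouTube; the topics, scoring rule, and function names are invented for demonstration.

```python
# Toy illustration of an engagement-based ranking loop.
# NOT any platform's real algorithm; topics and scoring are invented.

from collections import Counter


def rank_feed(posts, interest_counts):
    """Rank posts by how often the user has engaged with each topic."""
    return sorted(posts, key=lambda p: interest_counts[p["topic"]], reverse=True)


def simulate(posts, rounds=5):
    """Simulate a user who always engages with the top-ranked post."""
    interests = Counter()
    for _ in range(rounds):
        feed = rank_feed(posts, interests)
        # Engaging with the top post reinforces that topic's score,
        # so the same topic keeps rising to the top of the feed.
        interests[feed[0]["topic"]] += 1
    return interests


posts = [
    {"topic": "sport"},
    {"topic": "news"},
    {"topic": "gaming"},
]
print(simulate(posts))  # the feed collapses to a single topic
```

Even in this simplified model, after a handful of rounds the user's feed is dominated by one topic while the others never surface—a filter bubble in miniature, produced purely by the reinforcement loop rather than by any deliberate choice.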
Section 2
Social media has become a dominant force in the daily lives of young people, with usage steadily increasing over the years. Recent data shows that 92% of teens aged 15 to 16, 59% of children aged 11 to 12, and even 29% of children aged 9 to 10 actively use social media (Hotglue, 2024). This growing engagement has raised national concern, with some Australian politicians calling for stronger regulation. Independent Senator David Pocock has publicly expressed worry about the addictive nature of social media, highlighting the ways in which these platforms are impacting children’s development. Teachers and parents have echoed these concerns, reporting that social media is eroding attention spans, interrupting classroom learning, and even damaging children’s mental wellbeing (Hotglue, 2024). A report by the eSafety Commissioner of Australia further revealed that YouTube, TikTok, Snapchat, and Facebook are the most widely used apps among adolescents (eSafety Commissioner, 2021). These platforms are not only popular but are also some of the most heavily driven by algorithmic content curation. TikTok’s “For You Page”, YouTube’s recommended videos, and Instagram Reels all use advanced machine learning to personalise what users see based on their past behaviour. This means adolescents are continually served similar types of content, which reinforces existing interests, opinions, and beliefs—sometimes without them even realising it. These algorithms are designed to maximise engagement, often surfacing emotionally charged, attention-grabbing content that keeps users scrolling longer. Because these apps are designed to maximise screen time, adolescents find themselves increasingly stuck in filter bubbles and echo chambers. The more time they spend on these platforms, the more their feeds become saturated with similar views and content styles.
Over time, this shapes their perceptions of the world, reducing their exposure to differing ideas, alternative perspectives, and more balanced information. They begin to see the same opinions reflected back at them, which can distort their understanding of complex issues and limit their ability to think critically. In conclusion, the widespread use of social media among Australian adolescents—combined with algorithm-driven content and highly addictive platform designs—is playing a powerful role in shaping their beliefs and behaviours. These digital spaces are actively creating environments where young people are isolated in personalised echo chambers, making it increasingly difficult for them to access diverse and challenging perspectives. This directly supports the argument that filter bubbles are influencing adolescent development across Australia.
Section 3
Some people believe that social media users, especially adolescents, aren’t truly trapped in filter bubbles. They argue that users can choose to follow different accounts, change what they see online, or even stop using social media altogether. While that may be true in theory, these platforms are designed to make leaving or breaking out of these bubbles very difficult. Social media apps are built to keep people engaged for as long as possible by showing content that matches their interests and emotions. Hampton (2015) explains that platforms create persistent contact and pervasive awareness, keeping users constantly connected and aware of others online. This makes it harder for adolescents to step away, especially when their social lives and identities are so tied to what they see online. This constant connection also gives users a strong sense of community, but that can be misleading. Adolescents may feel like they belong to supportive online groups, but these groups can actually reinforce narrow views. Delanty (2018) points out that virtual communities are often built around shared lifestyles and interests. They feel personal and meaningful, but they also filter out different opinions. Even these communities can act as echo chambers. Hampton and Wellman (2018) also explain that while people often think earlier generations had better, stronger communities, today’s online networks create new kinds of connection—but they still come with limitations and risks, especially when it comes to reinforcing existing beliefs. So while adolescents technically can make choices online, the structure and design of social media make it extremely hard to escape filter bubbles. The sense of community and constant connection keeps young people stuck in content loops that reflect their existing views. This supports the idea that filter bubbles are built into the social platforms themselves and continue to shape and limit Australian adolescents’ exposure to diverse perspectives.
Conclusion
Throughout this paper, I have discussed how social media platforms—predominantly TikTok, Instagram, and YouTube—use algorithms that limit Australian adolescents’ exposure to a diverse range of perspectives. It is clear that these algorithms create filter bubbles, which isolate users by constantly feeding them similar types of content based on their past behaviour. As a result, adolescents are often drawn into echo chambers—online spaces where their existing views are repeated and reinforced, while opposing or unfamiliar ideas are filtered out. This is particularly concerning for adolescents, who are still forming their identities and worldviews. Their frequent use of these platforms, paired with the persuasive and addictive design of algorithm-driven content, means they are especially vulnerable to being shaped by what they see online. Despite the argument that users can control what they engage with, it is evident that these platforms are carefully designed to keep people engaged and to create a sense of community that is often misleading. These virtual communities may feel authentic, but they frequently operate as echo chambers themselves. As social media continues to play a central role in the lives of young Australians, it is essential to consider how platforms like TikTok, Instagram, and YouTube are limiting their exposure to diverse perspectives—and, in doing so, influencing how they see and engage with the world.