The Harmful Impacts Of Social Media On Australian Adolescents


Abstract

This paper addresses the question: What effect do filter bubbles created by social media platforms have on adolescents in Australia? As the use of platforms like TikTok, Instagram, and YouTube continues to grow rapidly, it is crucial to ensure that these digital spaces remain safe and supportive environments for young people. This paper explores how filter bubbles—created by social media algorithms that personalise content—are shaping the beliefs and behaviours of Australian adolescents by limiting their exposure to diverse perspectives. It argues that these online environments contribute to the formation of echo chambers, which narrow adolescents’ worldviews and negatively affect their development.

Introduction

In today’s world, social media is everywhere—and it influences us in ways we often don’t even realise. Australians now spend an average of 1 hour and 51 minutes per day on social media, more than a third of the 5 hours and 13 minutes of free time we typically have each day (ACS, 2025). This constant engagement is especially common among adolescents, who are the most active age group online. Young Australians aged 14 to 24 are spending hundreds of minutes per week on social media, with young women averaging 822 minutes and young men 528 minutes (ROI, 2022).

Every major social media platform—particularly TikTok, Instagram, and YouTube—relies on algorithms to personalise the content users see. These algorithms are designed to track our behaviour, such as what we like, share, or watch, and then recommend more of the same type of content. According to the Digital Marketing Institute, an algorithm is “a mathematical set of rules specifying how a group of data behaves,” and in social media it is used to sort, rank, and deliver content tailored to each individual (DMI, 2024).

While this may seem helpful, it has a hidden cost. These algorithms create filter bubbles: digital spaces where users are only exposed to information and perspectives that match their existing interests and beliefs. As explained by GCF Global, “Being in a filter bubble means these algorithms have isolated you from information and perspectives you haven’t already expressed an interest in” (GCF Global, 2024). For adolescents, this isolation leads to echo chambers—environments where repeated exposure to the same ideas limits their understanding of the world and reduces opportunities to learn from different points of view.

This paper argues that filter bubbles created by social media platforms are shaping the beliefs and behaviours of adolescents in Australia by limiting their exposure to diverse perspectives. As social media continues to dominate young people’s lives, it is essential to understand how these digital echo chambers—especially those found on TikTok, Instagram, and YouTube—are influencing their values, behaviours, and development.
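To make the "recommend more of the same" loop concrete, the toy simulation below shows how a feed can narrow over time. This is a minimal sketch only: the topic names, weights, and engagement probabilities are invented for illustration, and no real platform's algorithm is anywhere near this simple.

```python
# Toy model of "more of the same" recommendation (illustrative only).
from collections import Counter
import random

TOPICS = ["sport", "gaming", "politics", "fashion", "science"]

def recommend(history: Counter, n: int = 5) -> list[str]:
    """Pick n posts, weighted toward topics the user has engaged with."""
    if not history:
        return random.choices(TOPICS, k=n)  # cold start: anything goes
    # Weight each topic by past engagement, plus a small floor so unseen
    # topics stay possible, just increasingly unlikely.
    weights = [history[t] + 0.1 for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

history = Counter()
for day in range(30):
    for post in recommend(history):
        # Assume the user reliably engages with one favourite topic and
        # only occasionally with anything else.
        if post == "gaming" or random.random() < 0.05:
            history[post] += 1

# After a month the history, and therefore the feed, is dominated by a
# single topic: a filter bubble in miniature.
print(history.most_common())
```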

Section 1

Believe it or not, all social media platforms today use algorithms that control what content appears on your feed. Think about the last TikTok you watched or the most recent Instagram post you saw—those posts were specifically selected to reach your page based on your behaviour, whether you realised it or not. Although platforms use different algorithms, they all rely on machine learning and a set of factors known as ranking signals, which help determine the value of each piece of content for each user at a particular moment in time (Hootsuite, 2024). These systems are designed to increase the amount of time users spend on the platform by continually offering content that is most likely to keep them engaged. This personalised content delivery creates a more enjoyable and immersive experience, which encourages people to stay on the platform longer (Sprout Social, 2024).

For adolescents, this becomes especially influential. The content they see is carefully tailored to trigger emotional responses—often happiness or excitement—which leads to a dopamine hit that makes them want to keep scrolling. As mentioned earlier, Australian adolescents aged 14 to 24 are some of the heaviest users of social media, with young women averaging 822 minutes per week and young men 528 minutes (ROI, 2022). The more they engage, the more the algorithm feeds them similar content based on what they like or interact with.

While this might sound harmless, it actually restricts what adolescents are exposed to. ABC News (2024) explains that these algorithms can narrow the type of content users see, reducing their exposure to different perspectives or ideas. This is how filter bubbles form—when users are continually shown content that aligns with their existing beliefs, while alternative or opposing viewpoints are filtered out. This narrowing of content leads to the creation of echo chambers, which are online spaces where users mostly encounter views that reflect and reinforce their own. According to GCF Global (2024), this can result in misinformation and distort a user’s ability to understand or consider other perspectives. For adolescents still forming their identities and worldviews, these echo chambers can be especially harmful, as they limit critical thinking and empathy.

In the end, filter bubbles and echo chambers—driven by algorithmic systems—are influencing how adolescents in Australia think, act, and engage with the world. This supports the argument that social media platforms are not just neutral tools but powerful forces shaping adolescent beliefs by restricting their exposure to diverse perspectives.
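As a rough illustration of the "ranking signals" idea described above, the sketch below scores each post for a given user and sorts the feed by that score. The signal names and weights are assumptions made up for this example; the actual signals used by TikTok, Instagram, and YouTube are proprietary and far more numerous.

```python
# Hedged sketch of engagement-based ranking signals (not any real platform's code).
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    minutes_old: float
    predicted_watch_seconds: float  # an engagement model's estimate

def rank_score(post: Post, topic_affinity: dict[str, float]) -> float:
    """Combine hypothetical ranking signals into a single score."""
    affinity = topic_affinity.get(post.topic, 0.0)     # past behaviour
    freshness = 1.0 / (1.0 + post.minutes_old / 60.0)  # newer ranks higher
    watch = post.predicted_watch_seconds / 60.0        # predicted engagement
    # Illustrative weights: affinity and predicted engagement dominate,
    # because the system optimises for time spent on the platform.
    return 3.0 * affinity + 1.0 * freshness + 2.0 * watch

# Affinity built from what the user likes, shares, and watches.
affinity = {"gaming": 0.9, "news": 0.1}

posts = [
    Post("gaming", minutes_old=30, predicted_watch_seconds=45),
    Post("news", minutes_old=5, predicted_watch_seconds=20),
    Post("science", minutes_old=120, predicted_watch_seconds=50),
]

# The feed is simply the posts sorted by score; high-affinity topics win,
# which is how a filter bubble starts to form.
for p in sorted(posts, key=lambda p: rank_score(p, affinity), reverse=True):
    print(f"{p.topic}: {rank_score(p, affinity):.2f}")
```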

Section 2

Social media has become a dominant force in the daily lives of young people, with usage steadily increasing over the years. Recent data shows that 92% of teens aged 15 to 16, 59% of children aged 11 to 12, and even 29% of children aged 9 to 10 actively use social media (Hotglue, 2024). This growing engagement has raised national concern, with some Australian politicians calling for stronger regulation. Independent Senator David Pocock has publicly expressed worry about the addictive nature of social media, highlighting the ways in which these platforms are impacting children’s development. Teachers and parents have echoed these concerns, reporting that social media is eroding attention spans, interrupting classroom learning, and even damaging children’s mental wellbeing (Hotglue, 2024).

A report by the eSafety Commissioner of Australia further revealed that YouTube, TikTok, Snapchat, and Facebook are the most widely used apps among adolescents (eSafety Commissioner, 2021). These platforms are not only popular but are also some of the most heavily driven by algorithmic content curation. TikTok’s “For You Page”, YouTube’s recommended videos, and Instagram Reels all use advanced machine learning to personalise what users see based on their past behaviour. This means adolescents are continually served similar types of content, which reinforces existing interests, opinions, and beliefs—sometimes without them even realising it. These algorithms are designed to be highly engaging, often surfacing emotional or attention-grabbing content that keeps users scrolling longer.

Because these apps are designed to maximise screen time, adolescents find themselves increasingly stuck in filter bubbles and echo chambers. The more time they spend on these platforms, the more their feeds become saturated with similar views and content styles. Over time, this shapes their perceptions of the world, reducing their exposure to differing ideas, alternative perspectives, and more balanced information. They begin to see the same opinions reflected back at them, which can distort their understanding of complex issues and limit their ability to think critically.

In conclusion, the widespread use of social media among Australian adolescents—combined with algorithm-driven content and highly addictive platform designs—is playing a powerful role in shaping their beliefs and behaviours. These digital spaces are actively creating environments where young people are isolated in personalised echo chambers, making it increasingly difficult for them to access diverse and challenging perspectives. This directly supports the argument that filter bubbles are influencing adolescent development across Australia.

Section 3

Some people believe that social media users, especially adolescents, aren’t truly trapped in filter bubbles. They argue that users can choose to follow different accounts, change what they see online, or even stop using social media altogether. While that may be true in theory, these platforms are designed to make leaving or breaking out of these bubbles very difficult. Social media apps are built to keep people engaged for as long as possible by showing content that matches their interests and emotions. Hampton (2015) explains that platforms create persistent contact and pervasive awareness, keeping users constantly connected and aware of others online. This makes it harder for adolescents to step away, especially when their social lives and identities are so tied to what they see online.

This constant connection also gives users a strong sense of community, but that can be misleading. Adolescents may feel like they belong to supportive online groups, but these groups can actually reinforce narrow views. Delanty (2018) points out that virtual communities are often built around shared lifestyles and interests. They feel personal and meaningful, but they also filter out different opinions. Even these communities can act as echo chambers. Hampton and Wellman (2018) also explain that while people often think earlier generations had better, stronger communities, today’s online networks create new kinds of connection—but they still come with limitations and risks, especially when it comes to reinforcing existing beliefs.

So while adolescents technically can make choices online, the structure and design of social media make it extremely hard to escape filter bubbles. The sense of community and constant connection keeps young people stuck in content loops that reflect their existing views. This supports the idea that filter bubbles are built into the platforms themselves and continue to limit Australian adolescents’ exposure to diverse perspectives.

Conclusion

Throughout this paper, I have discussed how social media platforms—predominantly TikTok, Instagram, and YouTube—use algorithms that limit Australian adolescents’ exposure to a diverse range of perspectives. It is clear that these algorithms create filter bubbles, which isolate users by constantly feeding them similar types of content based on their past behaviour. As a result, adolescents are often drawn into echo chambers—online spaces where their existing views are repeated and reinforced, while opposing or unfamiliar ideas are filtered out. This is particularly concerning for adolescents, who are still developing their identities and worldviews. Their frequent use of these platforms, paired with the persuasive and addictive design of algorithm-driven content, means they are especially vulnerable to being shaped by what they see online.

Despite the argument that users can control what they engage with, it is evident that these platforms are carefully designed to keep people engaged and to create a sense of community that is often misleading. These virtual communities may feel authentic, but they frequently operate as echo chambers themselves. As social media continues to play a central role in the lives of young Australians, it is essential to consider how platforms like TikTok, Instagram, and YouTube are limiting their exposure to the diverse perspectives around them—and, in doing so, influencing how they see and engage with the world.


Comments

7 responses to “The Harmful Impacts Of Social Media On Australian Adolescents”

  1. pangi

    Hi Ben, your paper really caught my eye after watching ‘Adolescence’ on Netflix.

    I really liked how clearly you explained the impact of filter bubbles and echo chambers on young Australians. Your use of statistics and real-world examples made it easy to see how serious the issue is, especially with platforms like TikTok and Instagram.

    This connected with my own paper, where I looked at how content creators reinforce toxic masculinity among young men. Like you mentioned with filter bubbles, I also found that algorithms tend to push the same types of harmful content repeatedly, shaping how young people think about identity and relationships.

    As you probably already know after viewing my conference paper, some initiatives are trying to break through these bubbles by promoting more positive, diverse content on social media. Programs like The Man Cave and Tomorrow Man in Australia are good examples of this. They show that while algorithms are powerful, they can also be used to spread healthier messages if platforms and creators make a conscious effort.

    I’m curious about your thoughts – do you think schools or parents should play a bigger role in teaching adolescents how to spot when they’re stuck in a filter bubble? Or do you think the responsibility lies mainly with the platforms themselves?

    I really enjoyed reading this!

    1. ben.merendino

      Hey Pangi, thanks so much for the comment — I really appreciate it!

      That’s a cool connection to Adolescence on Netflix too, I’ve seen it and it definitely highlights a lot of what I was trying to get at in the paper. I also had a read of your work and thought it tied in really well — especially the way you broke down how influencers push certain behaviours and mindsets without people really noticing it. It’s crazy how much overlap there is with how these echo chambers form around ideas like masculinity too.

      As for your question — I honestly think it has to be a mix of both. Platforms definitely need to take more accountability for what their algorithms push, but at the same time, I reckon schools and parents have a big role in helping young people become more aware of what they’re actually consuming. Media literacy should be taught way earlier, and in a way that feels relevant — not just telling kids “don’t spend too much time on your phone,” but actually showing them how filter bubbles work.

      Thanks again for the thoughtful feedback — really enjoyed reading your paper too!

  2. Tayla Black

    Hi Ben, great paper!

    I’ve noticed when I’m on social media it can be hard to stop scrolling because it can get to a point where it’s almost as if every video is curated for me! I would love to hear more about how filter bubbles can restrict a person’s perspective and some examples of how this can be harmful. Would you say that this would be more applicable to politics, or are there other topics on which teenagers should have a broader perspective?

    The rate at which kids under 12 are using social media is really alarming! Obviously most social media platforms do have age restrictions, but it’s so easy to lie. Do you have any ideas as to how these age restrictions could be improved?

    Your paper explores a really prevalent issue in our society; I really enjoyed reading it!

    1. ben.merendino

      Hi Tayla, thanks so much for reading and for the thoughtful comment!

      I totally relate to what you said about scrolling — that endless stream of personalised content really shows just how powerful filter bubbles can be. You’re spot on asking how they can restrict perspective — while politics is definitely one major area, I’d say it also applies to things like body image, gender roles, and even health misinformation. For teenagers especially, seeing only one side of a topic repeatedly can make it feel like that’s the only valid truth, which makes it harder to build empathy or critical thinking.

      As for age restrictions — yes, it’s crazy how easily kids bypass them! One idea could be linking accounts to verified IDs (though that brings privacy concerns), or maybe making youth-specific versions of apps that offer curated, age-appropriate content and automatically filter out harmful topics. But realistically, I think the biggest solution is more education — teaching digital literacy from a young age so kids know what they’re being exposed to and how to think critically about it.

      Thanks again, I really appreciate your feedback!

  3. Layla

    Hi Ben, I really enjoyed this paper!

    It was very interesting to see the analysis of how filter bubbles shape the beliefs and behaviours of Australian adolescents. The connection between filter bubbles and echo chambers is particularly concerning, as it limits young people’s exposure to diverse viewpoints.

    Do you think social media platforms should be redesigned or regulated to reduce the negative effects of filter bubbles, especially for adolescents? Do you think that there is a way to do this while still maintaining user engagement?

    Your paper is very informative; I really enjoyed reading it!

  4. SammLaw

    Hi Ben,

    Thanks again for your comment.

    Your paper was really interesting, and it’s a bit of a scary topic, because even though I’m aware of the negative effects of endless scrolling I find myself doing it anyway.

    I feel most platforms don’t care about users’ mental health; they just care that they have users, because at the end of the day they are a business and money is the most important thing to them. But do you feel that if they were redesigned with stronger restrictions going forward, it would ultimately help, or is it not the platforms’ responsibility?

  5. icannell

    Hi! I had a great time reading your paper.

    I thought you did a great job at emphasising how filter bubbles affect Australian adolescents. It really is a topic that is close to home. I thought TikTok’s “For You Page” was a great example that helped further emphasise your point. It raises the question of how this can be avoided: when it comes to young Australians, who has the role of ensuring the effects of filter bubbles don’t become an even larger concern? Parents? Educators?

    I noticed some similarities in what I researched about how TikTok’s algorithm fosters micro-communities. Both our papers highlight the powerful role algorithms play in shaping identity, belief, and interaction online.