
Growing up online: Social Media’s failure to protect underage users.


Abstract:

As social media becomes an ever larger part of everyday life for kids and teens, there is growing concern about what all this online time is doing to their mental health, sense of identity, and privacy. Apps like TikTok, Instagram, and Snapchat give young people fun ways to be creative and stay connected with friends, but they also open the door to some serious problems, like constant pressure to perform, opaque algorithms, and inadequate safety measures. Even though there are supposed to be age limits, a lot of kids are getting on these platforms well before they are legally allowed, which raises big questions about how easy it is to slip through the cracks and how little oversight there really is.

This paper argues that social media has a huge impact on how young people figure out and present who they are, but that the way these platforms are built tends to put profit and engagement ahead of actually protecting users. By looking at how digital identities are shaped, where platforms fall short, what ethical data practices should look like, and how people are fighting for better protections, the paper shows the constant push and pull between giving kids freedom and leaving them vulnerable. Drawing on scholarship, recent studies, and political debate, it becomes clear that while more people are waking up to these problems, real policy change and safer, more ethical design are still missing. Until there is stronger regulation, more youth involvement, and a serious focus on protecting kids, social media will keep putting the next generation at risk.

Introduction:

It is no surprise that today’s youth are growing up online. A survey by the Pew Research Center showed that 95% of U.S. teens now have access to a smartphone (Vogels, 2022); it has become part of daily life for nearly every teenager. Social media platforms like TikTok, Instagram, Snapchat, and YouTube are now more than just entertainment; they are key spaces where young people connect, express themselves, and figure out who they are. A 2021 report by Common Sense Media found that nearly 40% of children aged 8 to 12 reported using social media, as shown in Figure 1, despite most platforms setting the minimum age for users at 13 (Rideout, Peebles, Mann, & Robb, 2021). This gap between age restrictions and real-world use raises urgent questions about youth safety, identity development, and regulation online.

Figure 1: Top social media platforms used by U.S. tweens (ages 8-12) in 2021. (Source: Rideout, Peebles, Mann, & Robb, 2021, p. 5)

No one will deny that these platforms can offer creative opportunities and social connection. Figure 2 shows that when U.S. teens aged 13 to 17 were surveyed in 2022, 70% said that social media gives them a space to express their creative side, and 80% said they felt more connected to what is going on in their friends’ lives (Pew Research Center, 2022). However, social media platforms also present some serious risks. The American Academy of Pediatrics (2021) highlights growing concerns about anxiety, depression, and cyberbullying among young users, while other scholars have pointed to risks including performance pressure, unrealistic beauty standards, and data exploitation (James et al., 2019).

Figure 2: Percentage of U.S. teens who say social media helps them feel more connected, creative, and supported. (Source: Pew Research Center, 2022)

These challenges are further compounded by the fact that most platforms are not designed with youth safety or well-being in mind. Critics argue that many social media companies optimise their platforms for engagement and profitability, often at the expense of ethical design and meaningful regulation (Kidron, 2023). Weak age-verification systems, engagement-driven algorithms, and limited protective measures leave the many underage users already active on these platforms increasingly vulnerable to manipulation, exploitation, and psychological harm (American Psychological Association, 2023).

This paper argues that although social media plays a major role in shaping how young people explore and perform their identities, it is fundamentally flawed in how it treats underage users. A mix of ethical concerns, design failures, and weak regulation has created a digital environment where profit is prioritised over protection. By examining the dynamics of identity development, platform shortcomings, data ethics, and current advocacy efforts, this paper highlights the urgent need for reform. Unless these platforms are built with young users in mind, from design to policy, social media will remain a space where opportunity and harm exist side by side for the next generation.

Digital Identity and Underage Users

As we have established, social media plays a powerful role in how children and teens explore and express their identities. As Danah Boyd (2014) explains, online platforms offer young people critical spaces to “hang out”, test boundaries, and perform identity in front of their peers. Platforms like TikTok, Instagram, and Snapchat encourage users to curate their online selves through photos, videos, bios, and shared content, practices that reflect what Papacharissi (2011) calls the performance of a “networked self”. For young users, these become key spaces to test out who they are, or who they want to be, under the gaze of peers, strangers, and algorithms.

Through algorithmic feedback loops, social media platforms reward visibility and popularity. As Noble (2018) argues, these algorithms are not neutral: they often amplify stereotypes, reinforce inequalities, and prioritise content that drives engagement over wellbeing. Likes, shares, comments, and follower counts act as signals of social value. My own son (aged 10) and daughter (aged 12), although currently only connected on primary school-based platforms, have told me that they choose to post or share only what they think their peers will approve of or find interesting. Even at this young age, they are consciously adapting their online behaviour to fit what they believe their peers expect. The result is that young users often tailor their content to gain approval, shifting their behaviour to fit what the platform algorithm rewards, often appearance-based or emotionally heightened content. Perloff (2014) notes that such environments can intensify concerns about physical attractiveness and foster self-objectification, particularly among adolescents. So while social media can foster creativity, self-expression, and connection, it also fuels a need for external validation.


Papacharissi (2011) describes this phenomenon as the “networked self”, where identity is constructed in relation to digital tools, networks, and audiences. In this context, teens may develop a sense of self that is highly performative, focused not only on who they are but on how they are seen. This is particularly concerning when trends prioritise physical appearance or controversial content, reinforcing narrow beauty ideals and risky attention-seeking behaviours (Perloff, 2014).


We see empowerment and risk existing side by side. Some young people use social media to build community, advocate for causes, or share talents, reflecting what Jenkins et al. (2016) describe as “participatory culture”, where users don’t just consume content but actively shape and share it. However, others may struggle with anxiety, self-doubt, or body image issues linked to constant comparison, as documented by Perloff (2014) and the American Academy of Pediatrics (2021). Evidence suggests that the pressure to maintain a “likeable” identity online can distort self-perception, as young users often shape their content based on peer feedback and platform incentives (Papacharissi, 2011; Boyd, 2014).


Ultimately, identity development in digital spaces is complex and shaped by a mix of peer influence, platform design, and cultural trends (James et al., 2019). While social media can offer powerful tools for self-expression, it can also entrench harmful norms and make self-worth feel dependent on engagement metrics.

Platform Failures and Regulatory Gaps

Despite platform age restrictions, underage users continue to access social media in large numbers (Rideout, Peebles, Mann, & Robb, 2021). Most platforms rely on self-reported ages at sign-up, a method easily bypassed by children (Livingstone, Stoilova, & Nandagiri, 2019). This lack of robust age verification enables children under 13 to access platforms not designed for their developmental needs or safety.


Algorithm-driven content delivery systems on platforms like TikTok and Instagram are optimised for engagement and often promote emotionally charged or visually striking content (Noble, 2018). While this maximises user attention, it also increases the risk of exposing young users to harmful material, including idealised body standards, sexualised content, and extremist views (American Psychological Association, 2023; James et al., 2019). Meta’s own internal research, leaked in 2021, revealed that Instagram was particularly harmful to teenage girls, contributing to body image issues and mental health struggles (Wall Street Journal, 2021).

YouTube has also come under scrutiny for pushing inappropriate videos through its recommendation algorithm on YouTube Kids, undermining its promise of a child-safe space (Tech Transparency Project, 2022). These platform-level failures suggest that content moderation and algorithmic design are not adequately addressing the needs of young users.

In Australia, the eSafety Commissioner’s reports have consistently highlighted gaps in industry accountability and emphasised the need for stronger protections for children online (eSafety Commissioner, 2025a; eSafety Commissioner, 2025b). Despite growing public concern, industry self-regulation remains the dominant model, with major tech companies continuing to resist oversight that could interfere with profitability (Kidron, 2023). As long as platform design prioritises engagement over wellbeing, and regulatory mechanisms remain weak or unenforced, underage users will continue to face disproportionate risk in digital environments.

Ethical Concerns and Data Exploitation

While today’s youth spend more and more time in online spaces, their growing presence raises urgent ethical questions about privacy, consent, and data exploitation. Unlike adults, minors may not fully understand the implications of data collection, yet their behaviours and actions are constantly tracked, stored, and profiled by algorithms. As James et al. (2019) point out, children often engage with digital platforms without a full awareness of how their information is being used, creating a concerning gap between participation and informed consent.

Social media platforms profit from collecting user data to personalise content and push targeted advertising. This commercial model, often referred to as surveillance capitalism, treats children’s digital interactions as monetisable assets. Brusseau (2020) argues that these systems reduce digital identity to a set of data points, monetising even personal expression. For younger users, this means losing a sense of control over how they present themselves online, with algorithms influencing what content they are shown and even how others see them.

There are also psychological impacts tied to all of this. Exposure to specific kinds of targeted content, especially posts that push appearance-related pressure or harmful behaviours, can make things worse for teens already struggling. The American Psychological Association (2023) has warned that algorithm-driven platforms can amplify harmful messaging and increase vulnerability, especially in adolescents who are already at risk.

And then we have the issue of consent. Most adults do not read or understand lengthy terms of service agreements, let alone our youth, yet young people’s personal data is still collected on a huge scale, often with little regulation. This raises concerns over the ethical obligations of tech companies handling youth data. As Livingstone and Third (2017) stress, children’s rights to privacy and protection online must be prioritised over platform profitability.

In the end, the way young users’ data is used points to a bigger problem: a clear imbalance between what benefits corporations and what is best for children. Until stronger data protection laws and ethical tech design become the norm, underage users will keep facing risks in an online world designed more for profit than for their wellbeing.

The Role of Advocacy and Policy

Lately, there has been a growing push to tighten age restrictions and improve verification for underage users on social media. In Australia, Prime Minister Anthony Albanese addressed the issue during an appearance on Sunrise, saying the government was looking into raising the minimum age for social media access to 16 (Schrader, 2024). He touched on the topic again more recently in a casual interview with YouTube creator Ozzy Man Reviews, once again stressing the need to better protect young people online. These comments show that both the public and politicians are starting to take the issue more seriously. Still, many critics argue that actual policy change is lagging behind. As the Guardian has pointed out, while age limits are being discussed around the world, there is still a lot of uncertainty about how they would even be enforced. Even with PM Albanese backing a campaign to ban under-16s from social media, the real-world impact remains unclear (Middleton & Taylor, 2024). MP Zoe Daniel put it bluntly: when it comes to online child safety, governments keep letting tech companies police themselves, and it is our children who are paying the price (Daniel, 2024).

Watch the Ozzy Man Reviews interview here: https://www.youtube.com/watch?v=lmmqy-YI2RU

Advocacy groups and researchers have been, and still are, pushing for stronger safeguards and greater accountability. UNICEF (2021), for example, recommends age-appropriate design, stricter data protection policies, and frameworks centred on children’s rights to create safer digital environments. In Australia, the eSafety Commissioner promotes a model called “Safety by Design”, which urges tech companies to build safety and privacy into platforms from the very beginning (eSafety Commissioner, 2023). But so far, the rollout of these ideas has been slow, and actual enforcement remains limited.

Some researchers, like Ellis and Goggin (2017), argue that education and activism are key to making a difference. Programs that teach digital literacy, things like how algorithms work, what data is collected, and how to navigate online spaces critically and safely, are gaining traction in schools. That said, these initiatives are not yet consistently funded or widely adopted. Without big-picture reforms and real cooperation across sectors, even the best advocacy work can only go so far.

In the end, if we want real change, policy needs to come with enforceable rules and be shaped with kids’ perspectives in mind. That means lifting up youth voices, holding platforms accountable, and getting governments, educators, and tech companies to actually work together. It is the only way we will build a digital world that puts young people’s safety first.

Conclusion

As someone who sees this happening with my own kids, it is clear that social media plays a huge part in how young people figure out who they are, connect with others, and get involved in the world around them. But for young people 18 and under, these online spaces still have some major underlying flaws. Most platforms are built to maximise engagement and boost profits, which means they often ignore what younger users actually need in terms of safety and healthy development. As this paper has discussed, the risks go beyond technical or legal concerns; they are also deeply personal and emotional. Underage users face everything from identity struggles and algorithm-driven pressure to data misuse and a general lack of proper safeguards. These platforms were not designed with them in mind, and it shows.

Advocates and politicians alike have been talking more about the need for change lately, which is a great start. But real reform is still moving slowly. Unless governments start enforcing stronger rules, tech companies commit to designing ethically, and young people actually get a say in shaping the digital spaces they use every day, the gap between the potential of social media and the protection young users need is not going away anytime soon. Hoping tech companies will simply do the right thing is no longer enough. If we want the next generation to be safe online, then it is time online spaces were built with real care and their well-being at the core. After all, it might not seem urgent until it hits close to home, and one day, heaven forbid, it may be your loved one who is at risk.

References:

Vogels, E. A. (2022, August 10). Teens, social media and technology 2022. Pew Research Center. https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/

Rideout, V., Peebles, A., Mann, S., & Robb, M. B. (2021). The Common Sense Census: Media use by tweens and teens, 2021. Common Sense Media. https://www.commonsensemedia.org/research/the-common-sense-census-media-use-by-tweens-and-teens-2021

Pew Research Center. (2022, November 16). Connection, creativity and drama: Teen life on social media in 2022. https://www.pewresearch.org/internet/2022/11/16/connection-creativity-and-drama-teen-life-on-social-media-in-2022/

American Academy of Pediatrics. (2021). The impacts of social media on children, adolescents, and families. https://publications.aap.org/pediatrics/article/127/4/800/65133/The-Impact-of-Social-Media-on-Children-Adolescents

Perloff, R. M. (2014). Social media effects on young women’s body image concerns: Theoretical perspectives and an agenda for research. Sex Roles. https://link.springer.com/article/10.1007/s11199-014-0384-6

James, C., Davis, K., Charmaraman, L., Konrath, S., Slovak, P., & Weinstein, E. (2019). Digital life and youth well-being, social connectedness, empathy and narcissism. https://publications.aap.org/pediatrics/article/140/Supplement_2/S71/34171/Digital-Life-and-Youth-Well-being-Social

Kidron, B. (2023). Beeban Kidron on why children need a safer internet. Centre for International Governance Innovation. https://www.cigionline.org/big-tech/beeban-kidron-why-children-need-safer-internet/

American Psychological Association. (2023). Health advisory on social media use in adolescence. https://www.apa.org/topics/social-media-internet/health-advisory-adolescent-social-media-use

Maiden, S. (2024). ‘Enough is enough’: Fed up PM confirms nationwide social media ban. News.com.au. https://www.news.com.au/technology/online/social/enough-is-enough-fed-up-pm-confirms-nationwide-social-media-ban/news-story/63f4be8620e0d7de2347ca33e0ae7e13

Daniel, Z. (2024, September 9). Time to rein in the technology giants. https://zoedaniel.com.au/2024/09/09/time-to-rein-in-the-technology-giants-the-australian-9-sept-2024/

Middleton, K., & Taylor, J. (2024). Albanese backs minimum age of 16 for social media. The Guardian. https://www.theguardian.com/australia-news/article/2024/may/21/anthony-albanese-social-media-ban-children-under-16-minimum-age-raised

Papacharissi, Z. (2011). A networked self: Identity, community, and culture on social network sites. Routledge.

Boyd, D. (2014). It’s complicated: The social lives of networked teens. Yale University Press.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.

Livingstone, S., Stoilova, M., & Nandagiri, R. (2019). Children’s data and privacy online: Growing up in a digital age. London School of Economics and Political Science. https://eprints.lse.ac.uk/101283/

Wall Street Journal. (2021). Facebook Knows Instagram Is Toxic For Teen Girls, Company Documents Show. https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739

Tech Transparency Project. (2022). Guns, drugs, and skin bleaching: YouTube Kids still poses risks to children. https://www.techtransparencyproject.org/articles/guns-drugs-and-skin-bleaching-youtube-kids-still-poses-risks-children

eSafety Commissioner. (2025a). Behind the screen: Transparency report – Feb 2025. Office of the eSafety Commissioner. https://www.esafety.gov.au/sites/default/files/2025-02/Behind-the-screen-transparency-report-Feb2025.pdf

eSafety Commissioner. (2025b). Mind the gap: Parental awareness of children’s exposure to online harm. Office of the eSafety Commissioner. https://www.esafety.gov.au/research/mind-the-gap

eSafety Commissioner. (2023). Safety by design: Industry reports and research. Australian Government. https://www.esafety.gov.au/industry/safety-by-design

Brusseau, J. (2020). Ethics of identity in the time of big data. https://philarchive.org/archive/BRUEOI-2

Livingstone, S., & Third, A. (2017). Children and young people’s rights in the digital age. New Media & Society. https://journals.sagepub.com/doi/10.1177/1461444816686318

Ellis, K., & Goggin, G. (2017). Disability and social media: Global Perspectives. Routledge.

UNICEF. (2021). Policy guidance on AI for Children. https://www.unicef.org/innocenti/reports/policy-guidance-ai-children

YouTube. (2025). Anthony Albanese talks to Ozzy Man Reviews [Video]. YouTube. https://www.youtube.com/watch?v=lmmqy-YI2RU

Jenkins, H., Ito, M., & Boyd, D. (2016). Participatory culture in a networked era: A conversation on youth, learning, commerce, and politics. Polity Press.

Comments

21 responses to “Growing up online: Social Media’s failure to protect underage users.”

  1. Benn van den Ende

    Hi 22438832 (I’m guessing that’s your student number),

    I really enjoyed this paper, specifically because it looked at the political economy behind social media platforms (i.e., they are fundamentally money making endeavours and this influences how we must think about them from a policy perspective).

    You mention policy in your last section and, though this might be outside the scope of your paper, I’m wondering what policies you think would be more effective? Do you think, for example, that we can regulate the existing platforms or perhaps we should look at creating new platforms with safety built into them at the foundational level?

    Thanks!

    1. 22438832

      Thank you Benn, I really appreciate your feedback and I am very glad you liked my paper and the fact I approached this topic from a political economical view point. I am sad that I feel my marks do not reflect this (said tongue in cheek).

      I do believe, in fact, in regulating most platforms. To make them age-appropriate, we could use something like a unique identification card, similar to a birth certificate or a passport but less valuable; this would help with, though not fix, all the problems with age restriction. It would need to be something like an NFC chip that computers and phones could authenticate, and many digital coins already use something similar for authentication. Either that, or biometrics, or perhaps both together. But first, the big call is for government to get serious about protecting children and forcing big companies like Meta to impose these authentication restrictions. This will be hard, as many big tech companies don’t want to miss out on the big incentives they currently get by collating data on youth internet use: patterns and trends, what young people find interesting, and where in the world it is interesting versus where it is not.

      But I speak to this as a Dad watching my kids want to fit in and be like their friends on social media, which makes it hard being the uncool parent, though I believe it is what is right for them now.

      Lyam.

  2. George Warner 22224293

    Great paper, I enjoyed the very informative style on how young users’ identities are shaped on social media. This is very similar to my paper, where I focused on how Instagram rewrites and changes people’s identities to be more performative and less authentic. I think you would enjoy reading about Goffman’s theories of self-expression; you can read my paper here: “https://networkconference.netstudies.org/2025/ioa/5959/selfconscious-how-instagram-rewrites-and-destroys-identity/”. Your paper did make me think, and I would love to hear your opinion on whether social media platforms are doing anything to protect the youth using these sites?

    1. 22438832

      Hey George,

      I really appreciate you taking the time to read my paper and thanks for the kind words! It definitely sounds like we’re thinking along the same lines. I’m super interested to check out your work, especially how you’re using Goffman’s stuff on performativity and self-presentation. That framework makes so much sense when it comes to understanding how people shape their identities online.

      On the question of whether platforms are actually protecting young users, yeah, I think they’re doing something, but honestly, it often feels half-hearted. Things like content restrictions and screen time settings and privacy controls are there, but they’re usually optional and easy to get around and not really enforced. It still feels like the burden is mostly on the users (or their parents), instead of the platforms building in real safety by default.

      A lot of it just seems reactive, like they are waiting for backlash before making any real changes. Unless there’s serious public pressure or actual regulation, I’m not super hopeful we’ll see major shifts anytime soon. But maybe that’s starting to change. Fingers crossed.

      Thanks again for reaching out, really looking forward to reading your paper!

      Lyam

  3. Cindy Ma

    Hi, your paper is very well-researched, as shown by the metrics figures you include to support it. Your paper thoughtfully explores the challenges of underage social media users. I especially liked how you addressed both individual and systemic issues, like platform responsibility and algorithmic influence. Great job, well done. If you have time, feel free to visit the link and check out my paper too: https://networkconference.netstudies.org/2025/onsc/5187/gig-workers-utilise-reddit-and-x-to-advocate-for-better-working-conditions-and-social-change/

  5. 22438832

    Hey Cindy,

    Thanks so much for the thoughtful feedback, it really means a lot! I’m glad the metrics part came through clearly, and I really appreciate you noticing how I tried to balance the individual experiences with the bigger systemic stuff. That felt like a key part, especially since so many young people are on these platforms without much in the way of real protection.

    Your paper sounds awesome too, I’m excited to read it! The way gig workers are using spaces like Reddit and X to push back and organize is super relevant right now. I’ll definitely give it a read and drop a comment when I do.

    Thanks again for taking the time to dive into my work!

    Lyam

  6. Khushi

    Hey Lyam,

    Very interesting paper! Being someone with a younger sister that’s been online since she was young this paper really touched me. “Growing up online” does a wonderful job at demonstrating the ways in which social media can be used both to empower and hurt children. I liked that you didn’t simply demonise such apps as TikTok and Instagram, you mentioned that there was a creative potential among its users, and also that these apps weren’t designed with the younger users in mind.

    That part about kids earning a living off likes and making their identities algorithmic hit home. My sister even worries about how she looks when creating videos to send to friends, even though she is still in primary school. It’s scary to think what will happen once she gets access to posting on social media… The same goes for the discussion on data ethics, which was also eye-opening; the idea that kids are being tracked and profiled without a proper understanding of it is frankly terrifying.

    I liked the emphasis on advocacy and policy, and I appreciated the reference to real action (or inaction) by leaders in Australia. It goes without saying that change is needed, and urgently. This paper forced me to think more deeply about the type of digital world my sister is growing up in, and how much more could be done to make it safe.

    Overall very relevant topic and great work!

    1. Lyam Temple

      Hey Khushi,

      Thanks so much for your thoughtful comment, it honestly means a lot. I’m really glad the paper resonated with you, especially with the connection you mentioned about your younger sister. The fact that she’s already self-conscious, even when just sending private video messages, really says a lot about how early this stuff starts.

      It’s amazing how creative and expressive kids can be online, but at the same time, it’s pretty heartbreaking knowing they’re doing it in such a complicated and often unsafe digital space. I totally share your concerns about where things are headed, and yeah, I agree that real change is needed, both in how these platforms are built and in the policies that regulate them.

      Thanks again for reading and taking the time to share your thoughts. These kinds of conversations are so important.

      Lyam

  7. Layla

    Hi Lyam! This is a well-researched paper that I really enjoyed reading!

    I especially enjoyed seeing how you wove together identity development, ethical concerns, and data exploitation into a cohesive narrative, making clear how these issues intersect and compound harm for underage users. Your use of current political discourse and advocacy efforts added urgency and relevance to your argument.

    I’m curious to know how you might further explore what youth-centered platform design could look like? Are there existing platforms or initiatives that serve as potential models?

    Overall I really enjoyed this paper! It would be amazing if you could check out my paper and let me know your thoughts; https://networkconference.netstudies.org/2025/csm/5477/are-influencers-in-adult-content-impacting-minors-negatively-the-impact-of-tiktoks-strong-online-communities-on-young-people/

    1. Lyam Temple

      Hey Layla,

      Thanks so much, I really appreciate your feedback, it honestly means a lot! That’s a great question. I think youth-centered platform design really needs to focus on limiting algorithmic manipulation and setting age-appropriate defaults from the start. Ideally, these kinds of systems should be developed with input from both young users and safety experts so they actually reflect real needs.

      We might already be heading into a future where AI can verify age through camera authentication, but that tech would need to be sharp enough to catch things like someone using a photo or deepfake to spoof the system. It’s definitely a complex issue, but the tech seems to be catching up quickly.

      I’m looking forward to checking out your paper too, it looks super relevant!

      Lyam

  8. Busher Avatar

    Hi,

    This was a great read! It’s interesting to see some of the dangers that young people face online. I do have to ask though, do you think social media platforms should primarily be the ones responsible for protecting kids online? I do agree they have a large role to play and can make the biggest change, but should we also shift more focus toward equipping young users with media literacy skills from an early age?

    You might be interested in reading my paper which looks at the negative effects Instagram influencers can have on teenage girls’ identity formation: https://networkconference.netstudies.org/2025/ioa/6031/the-price-of-perfection-the-impacts-social-media-influencers-have-on-teenage-girls-identities/

    1. Lyam Temple Avatar

      Hey Busher,

      Thanks so much for your comment, I really appreciate you taking the time to engage with my paper! You raise a great point, and I totally agree that media literacy has to be a bigger part of the conversation. Giving young users the tools to think critically about what they’re seeing and how they’re presenting themselves online is so important, especially as these platforms aren’t always designed with their best interests in mind.

      That said, I still think platforms need to take more responsibility, it shouldn’t fall entirely on kids (or schools or parents) to navigate environments that are often built around exploitation and performance. Ideally, it’s both: stronger education and better design.

      I have given your paper a read too, really powerful work. The focus on influencers and how they shape teen girls’ sense of self was super compelling. So much of that ties into what I was writing about as well. Thanks again for the thoughtful exchange!

      Lyam

  9. Jayne Avatar

    Hi Lyam,

    Thank you for sharing your thought-provoking paper. You clearly define the failures of tech companies and governments to put in place the necessary guardrails for our youngsters, leaving them vulnerable to the pressure to perform, the struggle to fit in and build identities, and the dangers of platform algorithms.

    Your paper made me reflect on the duality, as you describe it, of the “opportunity and harm” that exists in social media.

    I do note there are opportunities as well, and I wondered whether, in your research, you saw any tech companies that had started to put more ‘care at the core’, as you describe it, into what they are building. Especially as you highlight – we are all at risk.

    Thank you for your very thoughtful paper.

    All the best

    Jayne

  10. Lyam Temple Avatar

    Hey Jayne,

    Thank you so much for your kind words, I really appreciate you taking the time to read my paper and share your thoughts. I’m glad the “opportunity and harm” duality resonated with you, it’s such a tricky balance, and one that feels especially urgent when young people are involved.

    That’s a great question, too. In my research, I did come across a few platforms that are starting to move in the right direction, like adding more robust safety tools and improving content moderation. But even then, it often feels like surface-level stuff or something they roll out after public pressure. There’s still a long way to go before “care” becomes a core design value rather than an afterthought.

    Also, I read your paper and really enjoyed it, the way you explored self-representation and the shift from traditional media to more self-directed narratives was super compelling. It’s such a powerful reminder of how much agency people can reclaim online, even within flawed systems.

    Thanks again for your thoughtful comment, it means a lot to hear that my paper sparked reflection.

    All the best,
    Lyam

    1. Jayne Avatar

      Hi Lyam,

      It is good to hear that you found some movement within tech companies to protect the young, but I think you are correct that it is quite superficial really (sadly). I do like your description of ‘care at the core’, and that feels like it should be a rallying cry. It feels like, as a society, we got so excited by the freedom and accessibility of online content that we forgot to look after the vulnerable.

      Thank you – I am glad you enjoyed my paper, which does aim to highlight some of the advantages of online accessibility. Let’s hope that, as a society, we can bring ‘care’ into designs that afford both access and safe exploration.

      All the best

      Jayne

  11. JordanUhe Avatar

    Hi 22438832,
    You make arguments for restricting children’s access and for educational programs to make children less susceptible.
    To restrict children from social media, you will need either to confirm the user’s age with government identification, or to use age-detection algorithms that analyse the user’s face to decide if they are of age.

    My question for you: Is the protection of children an excuse to better govern and control who says what online? Is the wellbeing of children more important than online anonymity?
    A follow up question: Is the fear which people have for their privacy, what’s holding regulation back?

    I would argue that the protection of children is often used as an excuse for government to better surveil the public.

    The UK’s controversial Online Safety Bill finally becomes law | The Verge
    https://www.theverge.com/2023/10/26/23922397/uk-online-safety-bill-law-passed-royal-assent-moderation-regulation
    quote:
    “Messaging apps like WhatsApp and Signal have objected to a clause in the bill that allows Ofcom to ask tech companies to identify child sexual abuse content “whether communicated publicly or privately,” which the companies say fatally undermines their ability to provide end-to-end encryption.”

    This bill in effect removes the ability to offer end-to-end encrypted communication, which raises privacy concerns: if the platform also holds a set of keys for each conversation, that creates another security flaw which can be abused or subpoenaed by the government. If the UK government can subpoena Signal, what stops China from doing the same to crack down on dissent?

    I am aware of the slippery slope I am arguing on, but however these policies get implemented, they will have huge repercussions.

    From Jordan

  12. Lyam Temple Avatar

    Hi Jordan,

    Thanks for your thoughtful comment. You raise important concerns about the trade-off between child protection and personal privacy. I’ve worked in security for most of my adult life, across hospitals and organisations, often with vulnerable individuals. Almost every day of my career has been recorded, and I’ve come to see responsible surveillance not as a threat, but as a safeguard.

    In my experience, arguments against surveillance or governance often come from those seeking to avoid accountability. I personally believe that if systems are secure and not abused, I wouldn’t mind being recorded from start to finish. I have nothing to hide. Real freedom isn’t just about privacy, it’s also about safety. The freedom to walk down the street without fear, the freedom for children to explore without being targeted, the freedom to face consequences when harm is done.

    Yes, I want my children to grow up in a world where they are safe and free to make choices, but also free to be protected from the consequences of others’ wrongdoing. It saddens me that in trading real-world community for digital connection, we’ve lost a sense of collective care and responsibility.

    I’ve heard the concerns: “I don’t want the government spying on me” or “what if someone sees me doing something embarrassing.” But I would ask, if someone hurts your loved one or damages your property, wouldn’t you want evidence to hold them accountable?

    To me, opposing all forms of surveillance in the name of privacy is often a way to justify freedom without responsibility. I believe we need to build a society where privacy and safety aren’t opposing forces, but work together.

    Would love to hear your take on how we might build those safeguards without opening the door to abuse.

    1. Lyam Temple Avatar

      Sorry, also,

      As a father of six, the protection and safety of my children come before anything else, even before what some might frame as a child’s “right” to access harmful or unfiltered content. Through my work with vulnerable individuals, I’ve seen firsthand the damaging effects social media and online exposure can have. Limiting access to spaces that aren’t age-appropriate isn’t about control, it’s about care. It’s no different than setting age limits for movies or keeping young kids away from dangerous roads. We don’t let children wander into adult-only spaces, and I see no issue applying the same logic to online environments.

      I also hold strong views about shielding children from ideologies and content I believe to be harmful. I stand firmly against what I see as inappropriate or ideologically driven material being introduced in primary school settings, especially when it comes to things like drag performers being positioned as educational figures. I won’t stand for this and will not tolerate this kind of child abuse.

      Would standing for these boundaries also be seen as calling for more governance? Perhaps by some, but I see it instead as drawing a line in defence of childhood. We need to stop confusing restriction with oppression when the intent is clearly protection.

      Curious to hear where others believe the balance lies between protecting the young and preserving rights.

  13. Xing Bai Avatar

    Hi Lyam,

    What an interesting paper! I appreciate how you dug into not just the technical and policy gaps, but also the real emotional and identity struggles kids face online. Quick question: you mentioned education and digital literacy as part of the solution, but with tech and algorithms changing so fast now, do you think schools and parents can realistically keep up?

    1. Lyam Temple Avatar

      Hi Xing,

      Thank you! That’s such a thoughtful question, and honestly, not an easy one to answer. With how fast tech is moving, I don’t think schools or parents can realistically keep up in real time. But I do think we can focus on giving kids strong critical thinking skills that stick with them, no matter the platform. Teaching them how to question what they see, spot manipulation, and understand privacy risks might be more effective in the long run. Maybe the real goal isn’t keeping up with every new trend, but helping kids build resilient habits that let them navigate whatever comes next.

      Lyam