By the end of the year, Australia will become the first Western nation to implement a social media ban for under-16s.
Opinions are divided on whether it will work to curb online bullying and other undesirable behaviours, and kids being what they are, many will more than likely find a way around it; industry sources indicate that demand for VPNs has surged.
A better approach might be to lean on the social media platforms themselves to police their content and amend their algorithms, or to address why kids are seeking out harmful content and behaviours online.
Many may not recognise that harmful content, which often lacks nuance and is therefore easier to turn into bite-sized videos, tends to be favoured by social media algorithms, or that content creators are financially incentivised to produce whatever generates high volumes of shares and likes.
Consequently, harmful content, short on nuance and detailed evidence, is more likely to go viral and surface in young people's feeds, even when it is partly misinformation or disinformation.
Angelique Wan, CEO and Co-Founder of Consent Labs, says: “While it's important to assess the content itself and how to keep young people safe and educated among it, it's just as important to recognise that harmful online content is not just a social media issue, it's a social issue, too.
“Whether young people are looking for it or not, they are coming across a broad range of content, including harmful content that is misogynistic, homophobic or transphobic, racist, and more. It's important for our political, social, education, and community leaders to first and foremost ask why our youth are looking for, finding, and engaging with this content, and grapple with the answers to those questions so we can develop genuinely holistic and effective solutions to rapidly growing and increasingly influential online spaces like the manosphere.”
Because the ban will directly affect young people and their access to information, it is critical that leaders also listen to young people throughout the process of designing and implementing the ban and its associated policies.
“We need to equip young people with critical thinking and analytical skills to understand how online content is developed, who it serves, and how social media platforms' algorithms work. For example, embedding media literacy education into the school curriculum would enable young people to fairly analyse any content they are faced with, whether it’s factual or not.
“This is a critical skill that would have a domino effect on safety and fairness in the classroom, and on preventing harmful behaviours later in life, such as sexual violence. To counter the high volume of harmful content already available online, we also need to start creating an influx of healthy, constructive, and evidence-based content so young people have a range of views to assess.”
Jaimee Krawitz, CEO and Founder of Hide N Seek, sees firsthand how social media and video content are shaping the mental health of young Australians, both positively and negatively.
"Platforms like YouTube, TikTok, and Instagram have become integral to how children and teens learn, express themselves, and connect with others. However, they also expose young people to a constant stream of content that can promote appearance-based comparison, perfectionism, and disordered eating behaviours. This dual influence needs to be acknowledged in any policy or education response.
“Rather than removing access altogether, we should focus on smart, developmentally appropriate safeguards that reduce exposure to content that reinforces body dissatisfaction or diet culture. This might include changes to algorithms, content filtering tools, and better parental or educator oversight, not just blanket bans that ignore the complexity of youth digital engagement.”
Viewed through an eating disorder prevention lens, trends like “What I Eat in a Day” videos and extreme body transformation content can be particularly harmful.
“These videos are often disguised as health or lifestyle inspiration but frequently promote disordered thinking around food, exercise, and body image. Children and teens, especially those in emotionally or physically vulnerable stages, may interpret this content as a blueprint for how to look or eat, which can trigger shame, self-comparison, or even restrictive behaviours.
“At the same time, platforms like YouTube also host valuable recovery and mental health content that can be genuinely helpful for young people. There are entire communities of creators and educators who share supportive, recovery-based content focused on self-acceptance, intuitive eating, body neutrality, and mental wellness. For many young people, especially those who don’t feel safe talking to an adult, these digital spaces can provide relief, understanding, and a first step toward seeking help.”
What’s most needed is not just restriction but education. Children must be taught how to critically engage with online content, including recognising manipulation, harmful beauty ideals, and red flags in “wellness” messaging. This means embedding media literacy and body image education into school programs, and equipping teachers with tools to respond when concerns arise. Equally, platforms themselves must take more responsibility for moderating what is shown to young users.
Online spaces can become places where hate spreads. Without adequate controls or content warnings, young people are regularly exposed to trolling, bullying and racism.
But these harms are not limited to the internet. A 2021 Human Rights Commission survey found that over 40% of culturally diverse students had already faced racial discrimination at school, often well before age 16. Because schools often fail to equip students with the language to name it, social media becomes one of the few avenues they have to process and respond to these experiences.
Elizabeth Lang and Elizabeth Tekanyo, Co-founders of Racism Register, feel that banning YouTube might limit some exposure, but that it also risks silencing how young people process and respond to racism. Research shows racialised teens are using platforms like YouTube, TikTok and Instagram to share their stories, educate peers and build community (Gatwiri et al. 2022; Quiles-Kwock et al. 2024).
"We have seen firsthand how these platforms can help young people articulate experiences that are ignored or dismissed offline. Instead of blanket bans, we need meaningful support for young people. Tools like TikTok’s comment filters and YouTube Kids’ controls already exist. What is missing is practical guidance for parents and caregivers, and education for young people on how to recognise, report and safely respond to hate, both online and offline,” they said in a statement.