Ofcom's new online safety codes - a turning point for child protection?

Sara Spinks 3 min read

Despite recent advances in UK online safety legislation, serious gaps remain in how platforms like Instagram protect children from harm. This raises critical concerns for educators, safeguarding leads, and parents alike.

A hard-hitting report by the 5Rights Foundation has revealed that Instagram’s ‘Teen Accounts’, introduced in September 2024 as a safeguard for under-18s, are still exposing young users to harmful and adult content, even though the platform claims protections are in place. Testing the system, researchers behind the study easily bypassed age restrictions using fake birthdays, gaining immediate access to sexualised imagery, adult account recommendations, hateful commentary, and commercial content pushing harmful beauty ideals.

Meta, Instagram’s parent company, has defended its ‘Teen Account’ design, calling the report ‘inaccurate’ and pointing to its survey data indicating that 94% of parents find the protections helpful. However, campaigners argue the platform’s systems are easily manipulated and fail to prevent exposure to content that can negatively affect children's mental health, body image, and emotional well-being.

Baroness Beeban Kidron, founder of the 5Rights Foundation, responded unequivocally, stating:

‘This is not a teen environment. They are not checking age, they are recommending adults, they are putting them in commercial situations without letting them know, and it's deeply sexualised.’

These concerns come just as Ofcom has finalised the long-awaited Children’s Safety Codes, the first to take legal effect under the Online Safety Act. The Codes impose strict new obligations on digital platforms, requiring them to transform their operations to safeguard under-18s, or face substantial penalties.

Effective from 25 July 2025, the rules mandate over 40 actions that platforms must implement, including:

  • Age-appropriate algorithm adjustments to block harmful content
  • Robust, effective age verification
  • Swift response to harmful content
  • Child-friendly terms of service
  • Options to block group chat invitations
  • A named individual accountable for children’s safety
  • Annual safeguarding risk assessments reviewed at senior level

Platforms must also protect children from content related to self-harm, suicide, eating disorders, misogyny, violence, and abuse. Those that fail to comply may be fined up to £18 million or 10% of global turnover, whichever is greater.

Ofcom Chief Executive Dame Melanie Dawes has described the codes as a ‘gamechanger,’ acknowledging that while no solution is foolproof, the new rules give regulators real teeth, adding that: ‘If [tech companies] want the privilege of serving under-18s in the UK, they must change how those services operate.’

Despite this statement, others remain deeply sceptical. Ian Russell, father of 14-year-old Molly Russell, who died after viewing harmful content online, said the codes lacked ambition and failed to put child safety above tech industry interests.

Similarly, Hollie Dance, mother of Archie Battersbee, labelled the reforms a ‘small step forward’ but believes they do not go far enough to prevent future tragedies, asking the key question:

‘Why are we tiptoeing around these platforms when children’s lives are at stake?’

It is a viewpoint mirrored by SSS Learning Safeguarding Director, Sam Preston. In the Safeguarding Conversation podcast episode on the new codes, Ms Preston highlighted how difficult it would be to enforce regulatory fines against tech giants, given their history of not responding to external findings such as the 5Rights report and the vast funds available to them to fight legal action. In the podcast, Ms Preston stated:

‘I fail to understand the history of not holding tech companies accountable, but the fact this has happened makes it more difficult to retrospectively introduce measures. For too long this industry has, from a safeguarding view, been very poorly regulated.’

This is a critical juncture in online safety for DSLs (designated safeguarding leads), school leaders, and professionals working with children. While regulatory reform is underway, the findings from 5Rights highlight the urgent need for vigilance, education, and advocacy.

Key considerations for schools include the following:

  • Assume age controls can be bypassed and educate children and parents accordingly
  • Monitor children’s use of social media platforms and engage in open, supportive dialogue
  • Use the new Codes of Practice as a framework to review your organisation’s digital safeguarding strategy
  • Advocate for stronger accountability from platforms and push for continuous improvement

Online safety can no longer be a vague ethical concern; it must be viewed as a legal, professional, and moral duty. As the Online Safety Act helps reshape the digital world, safeguarding leads must remain proactive, critical, and unafraid to challenge the systems that still fall short. Platforms may promise protection, but real safety requires scrutiny. The big question is: will Ofcom be able to achieve this?

Sara Spinks

SSS Author & Former Headteacher

28 April 2025
