The Rise of AI-Generated Child Abuse: A Growing Threat

Sam Preston · 13 November 2024 · 2 min read

The rapid advancement of artificial intelligence (AI) has brought forth a new and disturbing trend: the creation of AI-generated child sexual abuse material (CSAM).

This technology, originally developed for legitimate applications, is now being exploited by individuals to produce hyper-realistic images and videos depicting non-consensual and harmful acts involving children.

How AI-Generated CSAM Works

AI-powered image and video generation tools, often referred to as ‘deepfakes,’ can manipulate existing media or create entirely new content. In the case of child abuse, perpetrators can use these tools to:

  • Replace the faces of real children in existing images or videos with the faces of other children, often generated from existing online photos;
  • Create entirely synthetic images and videos of children engaging in explicit acts, using AI algorithms to generate realistic appearances and movements.

The Dangers of AI-Generated CSAM

AI-generated CSAM poses significant risks:

  • Increased Availability: AI makes it easier for perpetrators to create and distribute large amounts of child abuse material, increasing the overall volume of this harmful content online;
  • Realism: The technology is advancing rapidly, making it increasingly difficult to distinguish between genuine child abuse images and AI-generated ones. This can hinder detection efforts and make it harder to protect children;
  • Normalisation: Exposure to AI-generated child abuse material can desensitise individuals to the horrors of real child sexual abuse, potentially leading to increased demand for such content.

As the recent case of 27-year-old Hugh Nelson has shown, the use of AI-generated CSAM can not only result in the sharing of sexually explicit imagery but also encourage further depraved behaviour.

Nelson not only used AI technology to create his images, he also accepted requests from other individuals via encrypted internet chatrooms. These requests asked him to create explicit images depicting children being harmed both sexually and physically, which he sold for money or shared with others for free. Last month, at Bolton Crown Court, he was sentenced to 18 years' imprisonment with an additional six years on extended licence, having previously pleaded guilty to a total of 16 charges relating to child sexual abuse offences.

Addressing the Threat

Combating AI-generated child abuse requires a multi-faceted approach:

  • Technological Solutions: Developing AI tools to detect and flag AI-generated CSAM is crucial. This includes improving image analysis techniques and collaborating with tech companies to implement safeguards;
  • Legal Frameworks: Updating laws to explicitly address the creation and distribution of AI-generated child abuse material is essential. This ensures that perpetrators can be held accountable for their actions;
  • International Cooperation: International cooperation between law enforcement agencies, technology companies, and child protection organisations is necessary to share information and coordinate efforts to combat this global issue;
  • Public Awareness: Educating the public about the dangers of AI-generated CSAM and how to recognise and report it is vital. Raising awareness can empower individuals to take action and protect children.

AI-generated child abuse is a serious and emerging threat and, as the case of Hugh Nelson has shown, combating this insidious form of child abuse demands urgent attention. CSAM awareness is key, and it should form a core part of DSL and staff safeguarding training.

Sam Preston

SSS Learning Safeguarding Director

