DfE AI Guidance for Schools

The Department for Education (10 June 2025) has released a full package of resources to support schools and colleges in navigating the use of artificial intelligence (AI) in education.
The guidance includes practical tools, training modules, and leadership materials aimed at helping school staff use AI safely, effectively, and in line with the law. It's a welcome step, but is it enough? Will leaders still feel unsure about how to move forward with confidence?
The new suite of materials includes:
- A PowerPoint presentation to guide internal discussions among senior teams
- Leadership videos and transcripts exploring key topics such as bias, data protection, and educational value
- A structured training module on safe and responsible use of generative AI
- An audit tool to help schools understand their current position and plan next steps
Schools are encouraged to adapt these tools to their own context, but the message is clear: AI is here, and schools need to be ready.
What Exactly Is Generative AI?
The guidance defines generative AI as software powered by large language models (LLMs), such as ChatGPT, Google Gemini or Microsoft Copilot. These tools are capable of generating human-like text, images, code and more. Their power lies in pattern recognition across vast datasets, but there are known risks.
The DfE cautions that whilst AI can help with tasks like planning, marking and admin, it must always be used with human oversight. Tools can ‘hallucinate’ (produce false or misleading information), reproduce bias, or mishandle sensitive data. As the guidance states: ‘any use of generative AI by staff, students, and pupils should be carefully considered and assessed, evaluating the benefits and risks of use in its education setting’. Whatever the purpose, staff must critically review any content AI produces.
Risk and Responsibility
Schools are reminded that they must uphold legal duties around:
- Safeguarding: protecting children from harm and ensuring tools are age-appropriate
- Data protection: under UK GDPR, student data must not be shared with platforms unless legally compliant
- Intellectual property: content created using AI must respect copyright and usage rights
The guidance emphasises that many popular AI tools are restricted to users aged 18+, meaning pupil access should be treated with caution. While the DfE allows schools to decide their own rules on AI use, the message is to tread carefully, particularly when it comes to student use.
The Curriculum Connection
The DfE stresses that AI tools should support, not replace, subject knowledge. Students still need a strong foundation in reading, writing and critical thinking to use these tools effectively. Without that, there’s a risk of undermining learning rather than enhancing it.
The guidance encourages schools to think carefully about how AI fits within existing teaching strategies, rather than adding it as a bolt-on or quick fix.
Assessment and Exams
The DfE is also clear that assessment integrity must be protected. Schools are advised to follow the Joint Council for Qualifications (JCQ) guidance to ensure that coursework, homework, and exams remain a true reflection of pupils’ own work. In some cases, this may mean shifting back to supervised assessments or handwritten work.
Investment and Support
HM Government has backed AI in education with significant investment. This includes:
- £1 million to support the development of AI marking and feedback tools
- £3 million to improve AI training datasets for education
- Ongoing research to evaluate the real-world impact of AI in schools, with Ofsted contributing to evidence gathering
However, beyond funding and pilot schemes, many school leaders are still calling for more joined-up direction and consistent standards.
In partnership with the DfE, the Chartered College of Teaching is offering a free certified assessment for school staff. Teachers and leaders can use the DfE materials, complete the training, and then take an online assessment to evidence their understanding of safe, ethical AI use in schools. This is an opportunity for staff keen to develop their digital confidence.
The DfE’s new materials represent a positive step forward. Schools now have access to free training, policy templates, and practical guidance on AI. But for many, the key issues remain:
- What platforms are actually safe to use with pupils?
- How do we balance innovation with legal and ethical responsibilities?
- Should AI use be limited to staff only for now?
- And how do we ensure consistency across the sector?
There’s no doubt that AI will play a significant role in the future of education. But until clearer national expectations are set, schools will continue to carry much of the burden themselves. By focussing on training staff and reviewing internal policies, schools can proceed with care: whilst they can’t afford to be left behind, they also can’t afford to get this wrong.
- SSS Learning Training Course – Safeguarding & Child Protection for Staff in Regulated Activity
- SSS Learning's Complete Safeguarding Training Suite
Sara Spinks
SSS Author & Former Headteacher
23 June 2025