Extremist Memes and Internet Subcultures – What Teachers Need to Know

At first glance, it might just look like a cartoon frog, or a blurry image with bold text that seems like nonsense. But in the corners of the internet where irony, humour, and ideology collide, these ‘memes’ can mean much more, and some carry messages that are far from harmless.
Extremist groups have long known how to exploit digital culture; however, in recent years the tone has changed. Much of the content now relies on humour, delivered through memes. It’s fast, sarcastic, shareable, and worryingly effective at drawing in younger audiences. So understanding how internet subcultures operate is not only important for online safety leads, but essential for everyone who works with young people.
A meme is usually a funny image, video, or phrase shared online. Most are harmless. But some fall into the category of extremist memes: content used to spread hate or radical ideas, wrapped in humour to disarm potential criticism.
Far-right groups in particular have developed memes that promote racist, antisemitic, misogynistic or conspiratorial messages. These memes make light of violence, promote ‘white pride’, or use symbols and phrases tied to extremist ideology.
Although these memes often originate on fringe forums like 4chan or Reddit, they don’t stay there. They travel fast, often appearing on TikTok, Discord, or even in gaming chats. Some memes become so mainstream that their original meaning gets lost, until someone digs deeper.
How Subcultures Use Humour to Spread Harm
One of the biggest risks is that this content is meant to feel like a joke. That’s what makes it appealing. However, these subcultures are using humour to spread harm. For some children, constant exposure to this type of content can slowly shift their perspective, moving the line of what feels acceptable.
A report from the Centre for Countering Digital Hate found that memes were being used strategically by far-right groups to ‘groom’ teenagers into their ideology through in-jokes, irony, and layered meaning. Often, the humour creates enough distance that children themselves may not even realise what they are being drawn into.
The Ofcom Media Use and Attitudes Report highlights that a significant proportion of teenagers aged 13–17 have encountered potentially harmful content online, with exposure often occurring through platforms featuring unmoderated gaming content and online forums.
Teenagers are wired for exploration. Online, this often means seeking edgy, offbeat content that feels outside the mainstream. Add in the pull of community and the thrill of rebellion, and internet subcultures can become very appealing.
Young people may be drawn to this world because:
- It offers a sense of belonging when they feel out of place elsewhere
- It allows them to feel ‘in the know’ by decoding layered meanings
- It gives them a voice, even if that voice is shaped by anger, fear, or confusion
- It taps into humour and sarcasm, making it feel more like play than ideology
The danger comes when the joke hardens into belief, or when young people start seeing harmful views as normal.
While it’s important not to overreact, it helps to recognise certain signs that might suggest a pupil is engaging with or being influenced by extremist subcultures:
- Pepe the Frog – originally innocent, but adopted by white nationalist movements and hate forums
- ‘NPC’ or ‘Red Pill’ terminology – terms that suggest followers see others as mindless or ‘woke’
- Numbers like 14 or 88 – these may refer to coded neo-Nazi slogans
- Use of Wojak or Chad memes – often part of ‘incel’ or misogynistic content mocking women or ‘weaker’ men
- Irony-soaked phrases like ‘based’ or ‘clown world’ – popular in online spaces that dismiss progressive values
None of these are automatically dangerous on their own. But in context, and if combined with isolation, anger, or sudden changes in belief, they may be red flags.
The Online Safety Act adds vital context to this issue. Under the Act, online platforms now have a legal duty to assess and reduce the risk of children encountering harmful content, including extremist material disguised as humour. While memes or in-jokes may seem trivial, if they promote hate, incite violence, or encourage radical views, platforms are expected to act. Ofcom, now the official regulator, has powers to fine or block services that fail to comply.
For schools, this means digital safeguarding doesn’t stop at the school gates. Education and early conversations are more important than ever, helping young people recognise what they’re being shown and why. Teachers are key to building the resilience the law alone can’t provide.
So what can schools do?
Schools don’t have the resources to become digital surveillance hubs, but they do have a safeguarding responsibility. Staff training in awareness and early intervention can have a significant impact. As we know from managing other safeguarding topics, open conversations can be very effective.
This includes:
- Including meme literacy in digital citizenship lessons. Teaching pupils to think critically about the content they consume and share
- Not dismissing controversial humour out of hand. Exploring why it’s appealing and what it might signal underneath
- Creating space for discussion. Giving students room to talk about online content without fear of punishment. This builds trust
- Encouraging positive digital identity. Supporting pupils to create, question, and engage online in ways that build rather than divide
- Working with parents. Many parents are not aware of these online subcultures. Sharing information through newsletters or workshops helps to bridge this gap
- Training pastoral and safeguarding teams to understand the coded language, trends, and warning signs
The Government’s Prevent strategy and guidance from the UK Council for Internet Safety (UKCIS) also recommend proactive education to build resilience to extremist narratives, not just reactive measures once harm has occurred.
It’s easy to laugh off memes. They’re designed to feel silly, chaotic, and trivial. But for some young people, especially those who feel angry, isolated or unseen, these memes can become stepping stones into darker spaces.
Teachers can’t control what young people see online. But they can help them understand it and offer safer, more honest spaces where they can explore who they are, without needing to ‘belong’ to something harmful.
Knowing what to look out for is the first step.
SSS Learning
16 June 2025