Extremist groups are quietly using popular online games and connected chat platforms to recruit children and teenagers, researchers and security experts warn.
How the recruitment works
Studies show that white supremacist, far‑right, and other extremist recruiters are moving into gaming spaces such as Roblox, Minecraft, Fortnite, and Xbox Live, where young players spend hours chatting and playing together.
These platforms, along with “gaming‑adjacent” services that offer live streaming and chat, such as Discord, are being turned into “digital playgrounds” where extremists build trust and slowly introduce hateful ideas.
Researchers say the process often starts with casual in‑game friendships. A recruiter might compliment a young player’s skills, invite them onto a team or into a private chat, then gradually introduce extremist memes, jokes, or symbols that seem harmless at first but steadily normalize violence and hate, according to the New York Times.
Some games even allow players to create custom maps or servers in which extremist content, such as hate symbols, racist dialogue, or violent scenarios, can be embedded.
Over time, some children are “funneled” from mainstream platforms like TikTok or X into less‑moderated gaming and messaging apps where extremist content is easier to share.
Security agencies in Europe and North America report that children and teenagers now make up a rising share of terrorism‑related investigations.
One United Nations–linked counter‑terrorism trend report notes that minors as young as 12 and 13 now figure in 20–30 percent of European counter‑terrorism cases, and that exploitation of children through gaming channels is growing.
Analysts at the International Centre for Counter‑Terrorism and other groups say extremists are deliberately targeting younger audiences because they are more impressionable and spend more of their time online.
Warning signs and protective steps
Experts warn that extremist‑linked gaming servers and chat rooms sometimes host simulations of terrorist attacks or mass shootings, including recreations of real‑world events such as the 2019 Christchurch mosque attacks, The Guardian reported.
Some groups also use live‑streaming features to broadcast hate speech or to encourage violent behavior, all while bypassing automated moderation tools.
In several documented cases, children have been radicalized after repeatedly playing with the same extremist‑linked users, who then deepen the relationship outside the game.
Parents and teachers are advised to watch for changes such as sudden use of hate speech, growing interest in conspiracy theories, withdrawal from family and friends, or a marked increase in time spent in fringe online communities.
Law enforcement and child‑protection groups recommend regular conversations about online safety, checking privacy and chat settings, and reporting suspicious contact or content to the game company or authorities.
Gaming companies such as Microsoft and Roblox say they ban extremist content and use AI tools and human moderators, but they also urge parents to stay involved in what children are playing and who they are talking to, according to The Conversation.
Originally published on parentherald.com