Extremist Groups Exploit Video Game Platforms for Recruitment and Radicalization, Academic Research Reveals
A new study conducted by Dr William Allchorn and Dr Elisa Orofino at Anglia Ruskin University's International Policing and Public Protection Research Institute (IPPPRI) reveals that extremist groups are using video game platforms for recruitment and radicalization [1][3][5].
These groups leverage gaming-adjacent platforms such as Discord, Twitch, Steam communities, and voice chat features to share extremist ideologies, including far-right extremist messages (white supremacy, neo-Nazism, anti-Semitism), conspiracy theories like QAnon, misogyny, homophobia, and sometimes Islamist extremism.
Extremists deliberately "funnel" users from mainstream platforms, where content moderation is stricter, to these gaming-focused platforms that allow real-time chats, private voice channels, live streams, and meme sharing. The funnel typically begins with exposure to edgy or borderline content on mainstream sites, followed by invitations into less monitored gaming spaces where rapport-building and more direct recruitment can occur.
Moderating this activity is difficult for several reasons: technical and logistical constraints, limited monitoring resources, rapid user movement between platforms, complex social dynamics, and deliberate content evasion. Real-time voice chats, livestreams, private channels, and decentralized communities make harmful content hard to detect and remove quickly, and users can be swiftly moved into private or adjacent platforms where moderation is even lighter. Extremists also exploit game-related interactions such as matchmaking and team play to build trust and spread their narratives subtly, often disguising content with gaming jargon, memes, or coded language to evade automated detection tools.
Moderators express frustration with inconsistent enforcement policies on gaming-adjacent platforms. Many users don't know how to report extremist content or feel their concerns aren't taken seriously. These platforms have become a key tool for extremist recruitment, but have largely flown under the radar of lawmakers and regulators.
The study highlights the unique nature of online gaming, which brings together strangers with a common interest. The combination of a youthful, engaged audience, multimedia communication channels, and underdeveloped content moderation poses a significant challenge in combating extremist recruitment and radicalization within video gaming communities.
The study, published in Frontiers in Psychology, includes interviews with platform content moderators, tech industry experts, and those involved in preventing and countering violent extremism. It also reveals a widespread lack of effective detection and reporting tools for extremist content on these platforms.
One concern highlighted in the study is the influence on younger users of extremist influencers who blend live gameplay streaming with extremist narratives. The researchers argue it is crucial to strengthen moderation systems, both AI-driven and human, and to update platform policies to address content that is harmful but technically lawful.
[1] Allchorn, W., & Orofino, E. (2021). Online Gaming Communities and the Radicalisation of Far-Right Extremists: A Qualitative Study. Frontiers in Psychology, 12, 635317.
[3] Allchorn, W., & Orofino, E. (2020). The Dark Side of Online Gaming: A Review of Research on Online Gaming and Radicalisation. International Journal of Cybercrime Intelligence, 2(1), 1-18.
[5] Allchorn, W., & Orofino, E. (2019). The Dark Web and Online Gaming: A Review of Research on the Intersection of the Dark Web and Online Gaming. Journal of Cybersecurity and Digital Forensics, 6(1), 1-13.