Dr. Jessica White is a Senior Research Fellow in the Terrorism and Conflict group at the Royal United Services Institute (RUSI) in London. Her expertise encompasses countering terrorism and violent extremism policy and programming, with a focus on gender mainstreaming strategies. She has over a decade of experience, both as a researcher and as a security practitioner in the United States Navy. Jessica leads RUSI’s Far-Right Extremism and Terrorism research program and is a co-founder of the Extremism and Gaming Research Network. She has published on a range of topics, including gender in security, far-right extremism, and terrorism in the media.

Galen Lamphere-Englund is a senior research and strategic communication consultant at the nexus of violent extremism, conflict, and tech issues. For over a decade, he has examined how various forms of radicalization can lead to violence and how to foster resilience to societal divides. Galen has led global research and programming in over 30 countries for United Nations agencies, governments, humanitarian agencies, think-tanks, and many of the largest tech platforms. Galen is a co-founder of the Extremism and Gaming Research Network and advises a range of clients on how to prevent the exploitation of online spaces and ICT by extremist and terrorist actors. He is also an Associate Fellow at RUSI.

CTC: Tell us about the Extremism and Gaming Research Network (EGRN). How did it come about, and what is its mission?

Lamphere-Englund: Jessica and I are co-founders and co-conveners of the Extremism and Gaming Research Network, which started a little over two years ago as a practitioner- and researcher-led initiative to try and unpack concerning developments we’d seen in the online space.

My background is as a conflict and terrorism researcher, and I’m also a gamer. I have been my entire life. I started to see those two worlds colliding more and more, and I was seeing significantly more violent extremist content in gaming spaces online. I sent around a note to a lot of other colleagues in this space and said, “Look, are you seeing something similar? Do you know anyone who’s also working on this intersection?” That’s when Jessica and I met, and we started putting together a wider group of practitioners to try and figure out: What kind of interventions can be programmed in this space? And on the research side, what kind of data do we currently have? And more importantly, what research gaps do we need to try and fill globally?

In the over two years since, we’ve put together a research agenda that we’ve gradually hacked away at through our different members, and we’ve managed to learn quite a bit about the scope of this problem, though there’s still certainly more to be done. That’s where we’re at now. We now have monthly meetings with over 100 invitees and roughly 60 formal members, and [are] continuing to grow quite rapidly.

White: People were, in an ad hoc way, realizing that the online gaming space was something that needed to be discussed more, but we were the first group put together to really get at this issue of extremism and violent extremism in the online gaming space. We originally were largely research institutions and individuals researching violent extremism, terrorism, and counterterrorism. As the network has grown, we’ve added a wider variety of people. We’ve added policymakers who are asking questions about how they can address extremism. We’ve added people from the gaming industry who are also asking questions about what they should be doing. We’ve added security practitioners who are concerned about this space. And we aim to make connections to other networks that are working on online gaming safety, including by bolstering research and ‘safety by design’a practices.

CTC: Who are the partners in your network? Who does EGRN work with?

Lamphere-Englund: It’s pretty wide. We are partners with a number of research institutions, so groups like the Royal United Services Institute, the Institute for Strategic Dialogue, etc. We also work with a lot of university partners—the University of Sussex in the U.K., for example. We have a lot of PhD researchers who come and join us. Then we have the Global Internet Forum to Counter Terrorism (GIFCT),b which is an industry body. We are partnered with the Global Network on Extremism and Technology,c which is also GIFCT-related but housed at King’s College in London, and Tech Against Terrorism.d On the law enforcement side, we work with a kind of eclectic mix of people because it’s based on interest. We have local beat cops who are designing ‘cops versus kids’ Esports initiatives1 all the way up to informal partners who will come in from fusion cells and from national CT and intelligence centers as well. We span the gamut, as long as there is a genuine interest in exploring the space and developing solutions for it.

White: It’s interesting to point out that the policymakers come along because they want to know what to do about legislating: What kinds of considerations need to be made in relation to regulatory efforts and content moderation questions? The police and the intelligence services often come along because, one, they’re interested in knowing more about the harms—investigating how deep this issue of extremism in the online gaming space goes and what the legal and ethical parameters around their intervention are if they’re trying to investigate in that space. Or, two, like Galen said, there are a lot of police departments that now use online gaming as a positive outreach tool: reaching out to groups in the community to build a bond of trust through positive gameplay engagement, or using it as a mentorship tool to encourage positive behaviors—almost as a P/CVE [preventing/countering violent extremism] tool in a lot of cases.

CTC: As a starting point, can you describe how big gaming is? And who’s participating in gaming?

Lamphere-Englund: Gaming is huge. It’s colossal as a space. Right now, there are around three billion people who game across the world, so more than one in four humans are gamers. Now, that’s a slightly misleading stat because it includes all of the mobile gamers who just play games on their phones. In terms of our area of interest, we’re more focused on games that have communicative abilities inside of them—online games that are multiplayer, generally speaking—so that’s a bit of a smaller audience. We’re still talking about a massive swath of people across the world.

It’s no longer a male-dominated space in terms of gamers themselves. When it comes to studios and game design, that’s a different story. But about half the gamers in the world are women now, and increasingly they’re everywhere, on every continent. About half the world’s gamers are in the Asia Pacific region; that’s probably the highest per capita in terms of where gamers are. But the fastest-growing audiences are in MENA [Middle East and North Africa] and Latin America at the moment.

Who participates in gaming? It’s everyone, really. Obviously, it’s concentrated in younger audiences. When we think about prime recruitment demographics for violent extremist organizations, terrorist groups, and armed groups globally, that core 14- to 24-year-old age range still constitutes one of the largest shares of gamers.2 There are a lot of older gamers now, too, but the majority are still in that slightly younger demographic.

White: Gaming is a huge industry. Its value is greater than that of other media—movies and the other formats you might assume make more money—gaming out-earns all of them.

Lamphere-Englund: $200 billion last year in revenue. That’s more than movies, TV, music, all of it. Gaming is incredibly lucrative and incredibly profitable.

CTC: Turning to the gamers themselves, what are the at-risk groups that you’re really focusing on? We know that gaming can attract isolated individuals. And then there is the presence of online, often anonymous chat groups. How can the online chatting aspect impact gamers?

White: It’s important to remember, as we are discussing the potential for extremism and violent extremism to be spreading in this space, that the majority of online gamers use it as a very positive engagement tool. People find positive communities in this space; there are gamer communities out there raising money for great causes. I think this question of at-risk groups is a really interesting one and a complicated one. We’ve seen over the last 20 years of counterterrorism efforts that it’s really difficult to pinpoint at-risk groups. There’s no profile for someone who might go out and commit a terrorist act. The radicalization process is usually a very complicated back-and-forth process of stepping toward, away from, and back toward extremism and violent extremist content.

The online gaming space provides a myriad of subcommunities, and to really investigate how these different subcommunities are interacting with each other, we need more research, because there isn’t a lot of research on how deep this issue [violent extremism] goes into the online gaming space. But we know from anecdotal evidence and incidents that have happened, things like Gamergate,e that the online gaming space tolerates and sometimes encourages sexist language, sexist attitudes, racism—a lot of the -isms that we are worried about that can lead to extremism and violent extremism. These dynamics are part of some of these gaming subcultures.

What is difficult is pinpointing that this group or this game has a problem. I think it’s a pretty widespread issue. It’s a reflection of society, because so many people now play games. Gaming is such a transnational gathering place, where you do get expressions of the hateful ideas that people hold and bring into their games and game chats with them, and exposure is quite widespread. So there’s an interesting challenge in looking for at-risk [groups]: looking to see whether there are certain gaming narratives that extremists can pick up and use as part of their ideology, or certain games whose communities allow hateful content into their spaces. But it certainly needs more investigation before we could really tell you whether there are specific indicators of that type that we can pick up on.

Lamphere-Englund: Agreed. Rather than focusing on individuals, we have tried to break out a typology of use cases and exploits by extremist actors. We’ve gone a bit in the opposite direction: asking what potential use cases for harm and exploits occur, rather than building individual profiles—an approach that hasn’t had much success across this entire field, let alone in gaming specifically.

The six general categories that we think are important to focus on are:

One, the production of bespoke or custom video games and ‘mods,’ or modifications, which are used primarily for recruitment and retention purposes.

Two, gaming culture references. Gaming is pop culture now. Where you might have seen far-right extremist groups using MMA [mixed martial arts] and fighting groups as a pop culture reference 10 or 15 years ago, gaming works for that now. That’s more about utilization in propaganda for recruitment.

Three, the use of in-game chat functions, which are directly used for organizing, for communicating, and for grooming or indoctrinating users.

Then, four, there’s the broader amorphous category of gamification: using game-like elements outside of games. Think of going to your coffee shop: You swipe a card, you get points back, and you ladder up those points against other users. That’s a gamified approach to marketing. We see extremist groups using gamified approaches both strategically and organically.

And then, five, there’s the broader gaming-adjacent sphere—platforms like Discord, Twitch, Steam, and Reddit—places where gamers congregate online that are not specifically games but are very popular with gamers and draw on gaming culture references. Those spaces are also used for a whole host of different purposes.

And six, there is the financing potential, where we see in-game items, avatars, apps, and game keys that can be sold. We know that money laundering occurs in these spaces, and there’s clearly a very strong potential for at least money laundering, if not direct terrorism-related financing, to occur through some of those spaces. That’s the general landscape of different harms, and it gives us a way to focus in on: Where are these harms occurring, and which individuals are impacted by them?


CTC: We know that a range of nefarious groups and actors can and do use gaming to spread propaganda and spread their influence, from far-right extremists3 to the Islamic State.4 Are there groups or ideologies that are most concerning in terms of their effective use of gaming platforms to spread their beliefs?

White: I would say that it reflects the ideologies that are most prevalent in society. In the North American and Western European context, you see a lot of far-right ideology being spread in the online gaming space. But if you looked to Southeast Asia or the broader Asian context, you would see Islamist extremism being the most prominent, or Buddhist extremism, or whatever other type of extremism is most prominent in that social space, reflected in the online gaming space. This goes back to what Galen was saying about the different ways you can think about the concerns of the online gaming space, but this socialization element is where we see the most potential for people to be exposed to violent extremist content or ideology, or potentially recruited. It’s a reflection of the actors that are most prominent in their cultural environment, their language environment.

There are certainly some violent extremist groups that have produced their own games, but this is more of an effort to communicate with their followers. They’re not big games. They’re not recruiting a huge following to play these games. It’s more of a communication tool, an engagement tool with people who are already interested in being engaged with those groups. But where you see the wider spread of horrible ideas and horrible ideologies—for example, white supremacy or neo-Nazism in the North American/Western European context—is often when people modify existing, popular games and put that content into existing platforms. You see it on Roblox, for example. There are many other examples, but you can play as a Nazi prison guard in a Roblox game. And Roblox is intended for children. It’s really easy to manipulate some of these games to insert that hate.

CTC: With modified games, like the Roblox situation you described, is that intended to recruit individuals?

Lamphere-Englund: Not really, from what we can see. The modified games generally seem to reach people who already have a point of ideological entry into whatever external ideology that is. There was a case recently in Singapore, for example, where two kids made their own servers in Roblox that were modeling ISIS fight scenes, but they had already started down a radicalization pathway before that.5 And then they made this online server themselves, similar to the Nazi prison guard example. There are a couple of examples from Germany looking at a very similar set of radicalization cases. Most of those kids were also exposed to other content previously and already had an interest in that ideology before they designed their games.6 So usually those ‘mods’ are more preaching to the converted, or to people who are already interested—trying to push them further along—but they’re not usually the first entry point. Exposure is a little different: It usually happens in in-game chats or livestreaming settings rather than in modified games.

White: One important element to point out here is that because so many people bring their everyday hate, their everyday racism, their everyday sexism—whatever it is—along to these games, it almost becomes a question of resilience in these communities. How high is the level of resilience to this exposure? Because it’s going to be there. You’re going to find it. The minute you start looking for it, you will find it. It can seem pretty benign, but it’s racist or sexist or whatever it might be, and as people become receptive to that kind of language and start showing a positive inclination toward that type of content, then you start to see the recruitment happening.

If somebody was there to try and recruit people into an ideological perspective or into a group or a network, they might pick up on this. As people become more positively receptive to racism or sexism or whatever it is, recruiters might pull them out of this group gaming/online chat experience and into a more closed environment where the conversation can become more ideologically charged. And they might be pulled into more and more closed, dark corners of the internet as they show interest in moving toward an ideological perspective. So the main concern about the online gaming space, at least from our perspective as a network, is this wider exposure to hate and how it can encourage some people into acceptance of violent extremist ideology.

CTC: Earlier, Galen, you mentioned ‘strategic’ and ‘organic.’ Linda Schlegel has described “top-down gamification” versus “bottom-up gamification,” where “top-down gamification refers to the strategic use of gamified elements by extremist organizations to facilitate engagement with their content”—so a way to very directly get propaganda and other material in front of individuals—and “bottom-up gamification emerges organically, in online communities or small groups of individuals radicalizing together.”7 So again, what you spoke about with in-game chatting and so forth. Do you agree with that differentiation? Is it helpful when you think about this problem set, and if so, which in your opinion is more difficult to contend with?

Lamphere-Englund: Linda is wonderful. She’s one of our founding EGRN members, and so it’s always great to hear her work come up. She’s done some really wonderful thinking on this for longer than most people have, which is excellent. Yes, I generally agree with that premise. I think it’s a useful heuristic. I’m working on a kind of typology that builds on that at the moment.

In terms of organic use, I think there are two helpful subcomponents inside of that, not to overly categorize things. Number one, there’s the use of social spaces. Jessica, you were speaking to this a moment ago: People bring along their own biases, their own hate, when they play games. So organic use to me is also, look, across a community of billions of gamers, there are going to be some pretty nasty extremists who have really hateful views. You can look at the Anti-Defamation League surveys in the U.S. last year, where they found 20 percent of adult gamers have been exposed to white supremacist ideologies in online games; that doubled year on year.8 Does that show that games are the epicenter of that, or does that show a broader societal shift? Well, if you look at polling across the U.S. in general, that seems to reflect broader societal shifts at the same time. Maybe it’s a bit higher, maybe it’s more concentrated, but that’s also because what resonates online tends to be the most inciteful and the most outrageous type of content. So there’s that social aspect. Also, people like to game together.

Small group dynamics of the type you will be very familiar with at West Point—the group bonds that make a squad really effective on the battlefield—play out similarly in online spaces. When you game together in a “raid instance,” fighting together in a small team, you’re engaging in small group dynamics. At the same time, the small group social bonds that make squads effective and cohesive units in battle also confer radicalization risks: If one member of your group becomes radicalized, that’s actually very likely to spread to other members of your group.

The other organic side that I think is interesting is identity fusion, which is something that one of our colleagues, Rachel Kowert, has looked into,9 and that draws on broader psychological research on radicalization. It looks at the internalization of a group identity: When that identity overwhelms all the other layers of your identity, it becomes much more likely that you’re going to be susceptible to radicalization. That also holds true for gamer identities. When a gamer identity becomes the primary identity and becomes completely internalized, you’re much more likely to do things to advance that group. What Rachel has found with her work is that this can lead to a whole host of other negative behaviors and practices: willingness to fight or die, aggressive behaviors, Machiavellianism, narcissism, sexism, racism, and endorsement of extremist beliefs.

Strategic use we definitely see as well, where VEOs [violent extremist organizations] and terrorist groups have tried to actually weaponize and exploit games; that’s the broader six-part typology of specific use cases and exploits discussed earlier.


CTC: Returning to online chat platforms, what are the impacts of apps like Discord, which can offer a degree of anonymity, on individuals who could be exposed to violent extremism through them?

White: The chat spaces are often where the socialization processes are happening. Academics have rehashed the question of ‘do violent video games cause violence’ for decades now, and EGRN definitely takes the line that it’s not the video games themselves that are the problem. It’s not the violence of the video games; it’s the environments in these online chat spaces, where you form your gameplay community. Our colleague Dr. Rachel Kowert always says that the online gaming space is a world-building experience: You’re making something together, you’re achieving a goal together. Rachel is a psychologist, and she’s studied the psychological effects of the adrenaline and the feeling of connection, and she has found that the online experience of achieving a goal with your group is actually very similar to the real-world experience.

So the platforms themselves aren’t the problem; it’s the socialization that occurs on those platforms and the connections that are made. They can be very positive in nature; they can be great communities for people to be a part of. But they can also be negative. If you have entered a chat environment where you’re playing games with a group and have formed a bond through the gameplay experience, that opens you up to hearing their ideas about the world. If those people happen to have hateful or extreme ideas, and you’re engaged with them in gameplay and feel that peer-group attachment to them, then you might be more susceptible to their ideas and more willing to take on their extremist nature. People go to these platforms and start talking to each other because of the gameplay, but the conversation about everything else is really what we want to look at.

Lamphere-Englund: I completely agree, and it’s also interesting to think about the audience funnel here. In stratcom [strategic communication] and marketing work, we think about how [to] bring audiences to a conversion point where you have actually shifted their ideas—whether you’re trying to convert them to buy something or trying to convert them ideologically.

Where does that happen? If you think of it in terms of games, maybe that first entry point is an in-game chat where someone makes a racist or a sexist or a misogynistic comment; they look for reacts. From there, you can start a DM [direct message] with that player. You can either have an audio chat with them or you can chat with them online. You say, ‘Hey, you look like a cool guy. Let’s talk a bit more about that.’ Then you bring someone to a wider service. So you might bring someone to Discord, for example. Discord is not necessarily fully anonymous; it’s pseudonymous. There’s actually identifiable data that links the username to a real identity; it’s just not shown publicly in that server, unlike 4Chan, which can actually be fully anonymous. Inside of Discord, you have those chats that are happening. Discord can be used for organizing, but then you can also bring it another level down and have a chat with someone on a fully encrypted platform. So you bring them to Signal or to a Telegram group. The communication platform choice depends on the degree of anonymity or discretion required. So online chat rooms show how that can happen in a broader socialization sphere. And then, specifically, we can see how different channels are used for different types of communication and organization purposes, too.

CTC: Do you still see Discord as that platform for discussion, or more often do you see gamers jumping straight to 4Chan, for example?

Lamphere-Englund: Discord is interesting. They’ve done a lot of work to try and improve their content moderation policies and their trust and safety work, and they’re probably doing more than the vast majority of platforms out there. I think they’re really trying to tackle this as a problem set. That being said, Discord is absolutely still used because it’s a massive space. We’re talking about millions and millions of users who can create specific community servers. It’s that creation of a community space that makes it really helpful for all manner [of activity], whether that’s trying to do your homework or discussing a specific game or talking about organizing a white supremacist cult that wants to go out and kill people. So you have all these different types of communities that can be imagined and exist. And that’s both the strength of it as a platform and why it also gets exploited.

White: It’s useful to note that a lot of violent extremists are very clever at knowing what the rules of engagement are. They know when to move to a darker corner of the internet, a more closed space with less moderation. They know how to operate within the terms of service of these platforms and how to edit the wording of a conversation just enough that content moderation doesn’t pick it up. The trust and safety teams of these companies, who are not necessarily all violent extremism experts, are a few people working against a tide of hundreds if not thousands of people, trying to be effective in addressing these problems while their counterparts are very clever at moving around and into spaces where it’s easier to get around moderation efforts.

CTC: With both the Christchurch, New Zealand, attack in 201910 and the Buffalo, New York, attack last May,11 the perpetrators livestreamed their attacks. And there have been gamified elements present in other recent attacks.f It’s one problem set to try to eliminate extremist content on the platforms, but how can stakeholders work to prevent elements of gaming manifesting offline, in the form of terrorist attacks? Is that beyond the scope of what platforms are responsible for or capable of mitigating?

White: I think it kind of is, in a sense. Going back to Linda’s categorization of gamification, that top-down element: Gamification is a tool. It’s not positive or negative in and of itself. Marketing companies have been wielding gamification to sell you products for decades, and they have a huge body of historical research on how to use gamification positively. There are policymakers now concerned about ‘What do we do about gamification, livestreaming of attacks, and the gamified element of attacks?’ But you can’t legislate against gamification. It is a tool. So, in a sense, it is somewhat beyond the scope of what can be addressed by gaming companies or by legislation with content moderation-type efforts.

On the other hand, the issue is that this element of awarding points to go out and kill people, or whatever it is—using gamification to make engaging with horrible, hateful activities a fun experience for people who want to do that—reflects a wider social acceptance of these hateful ideas. It wouldn’t spread if people weren’t clicking on it and wanting to look at these manifestos and the videos of these attacks. It’s a reflection of a wider problem, I think. So it’s perhaps outside the scope of what gaming companies can address, but it should be addressed as part of this question of what we do about the mainstreaming of extremism in society as a whole and why people are enticed by getting points for going out and killing people, or whatever it is.

Lamphere-Englund: I think that’s very true. While it is not incumbent on tech platforms to prevent all forms of violent extremism and misuse of their platforms, there are some steps to be taken. Twitch, for example, after it was used to livestream the attack in Halle, Germany,12 designed new image recognition technology to try and spot and take down attack footage more rapidly. So when Buffalo happened, the live video went down within two minutes.13 Now, that content was then reposted online and still received millions and millions of views, but the actual technical response improved substantially. So while technical solutions are not going to get us out of this problem, there are still internal innovations that can happen.

[What] we try to do as a network is promote those conversations between platforms and teams and regulators and policymakers to say, ‘Here are promising practices that maybe you all should share rather than just silo them.’ Because there’s not necessarily an incentive to share across different companies, even though the individuals there might be interested in doing it. So trying to cross-pollinate some of those good practices and technical solutions can be helpful, too.

CTC: From your perspective, who should ultimately be responsible for keeping gaming safe? Should it be the developers who control the games and profit from their use? Governments that have an interest in public safety? An international organization, as gaming is borderless in many regards? Or should it be the end user and that mitigation ultimately comes down to the players themselves?

White: It is a tricky question. I think it’s a little bit of everyone. Everyone must get together and be involved in making it a safe space. It’s massively cross-geographical, cross-language, cross-identity, cross-government. It’s a space used by people around the globe of all ages. So I think everyone has a little bit of responsibility in making it a safer space, just as everyone has a responsibility in making social spaces safer. There are definitely things that gaming companies can do. ‘Safety by design’ is a phrase used in the gaming industry to describe efforts to mitigate illicit or harmful activity in games, or the harmful effects of games. I think that part of the industry can become more aware of concerns around violent extremism and think about the narratives of the games and how they’re played.

On the question of content moderation: It is not a whole solution, but it is a useful tool, and governments can help increase the emphasis on content moderation efforts and ensure that it happens across the industry, rather than [just] within a few of the large companies that want to do it and engage with it positively, while some still resist and don’t want to engage with it at all. You see varied interest from the gaming companies as to whether they want to engage in conversations around this, whether they’re interested in trying to address violent extremism on their platforms. So I think government can be useful in enforcing a more even approach across the companies.

But it’s also about end users, like you say, and the gaming communities policing themselves, policing the behavior in their communities, and not allowing that kind of hateful content to be encouraged or allowed in their spaces. Ultimately, that is the piece of the puzzle that has to fall into place, because until you start encouraging these communities to police themselves, or encouraging them not to buy games because [perhaps] a game is bringing a harmful element into their community, the approach to addressing it will remain ad hoc.

Lamphere-Englund: Agreed. I think an eminently cross-sectoral solution is needed, and there needs to be a lot more synchronization across actors. That’s something we really believe in and try to practice. On the regulatory side, the U.S. is behind. If you look at NetzDG in Germanyg or the Online Harms Bill in the U.K.h or the DSA in the E.U.,i we’re seeing a lot of nascent efforts to try and figure out how to grapple with online harms. None of them are perfect, but in the U.S., we’re not really even at that point yet. There is the regulatory side, which sometimes gets things wrong and can be improved. There’s the platform side, which needs to be encouraged and, hopefully, to work on regulation in conjunction with government to actually figure out some of the nuances of these platforms, especially smaller ones. And then there’s the assistance aspect: Groups like Tech Against Terrorism really try to help smaller platforms understand what harmful content is present and how they can do something about it.

Parents and educators are massively important here, too. With younger generations, there’s no difference between the online world and the offline world; it’s just their world. So as a parent or as an educator, you should also be looking at that part of kids’ worlds. That’s a really important element for keeping folks safe online.

CTC: What is the potential of P/CVE gaming?

White: I think it’s difficult because often P/CVE programs are small-scale, with small budgets and a small amount of time [for their development], and to develop a game in that space that’s actually going to be picked up and played widely is challenging. I do think where you see, for example, police departments that have programs like ‘cops versus kids’ to engage with communities through online gaming, that’s a really positive use of the tool. So, to use Esports or whatever it might be as a platform for engagement with other communities, [while] it’s not exactly P/CVE all the time, it’s a really positive use of the online gaming space.

When it comes to P/CVE as a type of programming, practitioners could become more familiar with how to ‘gamify’ their efforts and make them more engaging and more fun for the communities they’re trying to reach. We could learn a lot from the marketing industry, which has been using this, and from the educators who have used it in schools to make education interesting. There are tools of gamification that could be useful to the P/CVE world.

Lamphere-Englund: Agreed. Programs I have worked on before have used influencers as well to talk about social issues or P/CVE issues—using Esports influencers or gaming influencers to talk about, in softer terms, exposure to extremists or harmful content online and pushing back against it. We’ve had some colleagues working on online bystander effect questions: How do you use mentorship to actually change how people react when they see harms happening online? There is space there to engage. And game-based learning is also effective at changing behaviors and skills. Games are actually really beneficial, if you can get them in front of people. So using games for P/CVE in educational settings can be really beneficial.

We’ve seen a lot of promise in using games to counter disinformation. So when you use them in the right setting in the right way, they’re helpful, but just investing money blindly into building a game and expecting that someone’s going to use it, no. Think of your audience: How are you going to reach the audience? If the game is the right way to do that and you have a way to get that in front of people, it’s an effective tool, but it’s really only one tool in a pretty wide space.

CTC: If the relevant stakeholders fail to adequately address extremism in the gaming community, what do you think or fear will happen in this space over the next few years? Are there any possible solutions to help mitigate this that they should be taking advantage of?

Lamphere-Englund: Gaming is one window into the future of social reality for humans. Reality is becoming increasingly distorted and fused with virtual aspects. Arguably, what happens in virtual reality is every bit as real as what’s happening in the offline world, and that distinction will become blurrier and blurrier until, I think, it completely goes away, even for folks who engage online very little. If gaming shows us a window into the future and we fail to grapple with the fact that extremism is becoming really present in quite a number of online subcommunities and gamer subcommunities, then the costs of not addressing those harms grow higher and higher.

I think the P/CVE and CT spaces generally missed the social media wagon tremendously; they missed a lot of online socialization problems by five to 10 years. The gaming sector right now is behind social media platforms when it comes to addressing all manner of online harms. Now, if we can help the game sector to catch up, that also will help our future realities to be hopefully a bit more safe, inclusive, and fun. So if we can address some of these problems now, we’re setting ourselves up significantly better for the next decade, two decades, three decades. If we don’t invest in understanding these problems now and ways to successfully respond to them, I think we’re looking at a lot more toxic and potentially more dangerous future online and offline [activity] because that distinction is going to be much less relevant.

White: I think that’s the point: The gaming space reflects the social reality. There’s a wider social reality right now in today’s world—post-COVID pandemic, with global crises, with political polarization—in which extreme content is being mainstreamed. There’s discussion in mainstream news, mainstream media spaces, and mainstream politics of very extreme ideas and even extremist ideologies. As that mainstreaming of extremism happens in society, it will continue to get worse in the online gaming space, and vice versa. They reflect each other in many ways.

Lamphere-Englund: I believe there’s an opportunity. Coming from working on really intractable conflicts to working in gaming, I like gaming because there’s potential to actually move the needle a bit. There are interventions that work. There are ways that we can make these spaces a bit safer. There’s a potential to actually do something. Changing everything in broader society is really difficult. But can we change some things in gaming spaces? I think we can, and that’s worth investing in. CTC

Substantive Notes
[a] Editor’s Note: “Safety by Design focuses on the ways technology companies can minimise online threats by anticipating, detecting and eliminating online harms before they occur.” “Safety by Design (SbD),” World Economic Forum, n.d.

[b] Editor’s Note: “The Global Internet Forum to Counter Terrorism (GIFCT) is an NGO designed to prevent terrorists and violent extremists from exploiting digital platforms. Founded by Facebook, Microsoft, Twitter, and YouTube in 2017, the Forum was established to foster technical collaboration among member companies, advance relevant research, and share knowledge with smaller platforms.” “About,” Global Internet Forum to Counter Terrorism, n.d.

[c] Editor’s Note: “The Global Network on Extremism and Technology (GNET) is an academic research initiative backed by the Global Internet Forum to Counter Terrorism (GIFCT), an independent but industry-funded initiative for better understanding, and counteracting, terrorist use of technology. GNET is convened and led by the International Centre for the Study of Radicalisation (ICSR), a globally renowned academic research centre based within the Department of War Studies at King’s College London.” “About,” Global Network on Extremism and Technology, n.d.

[d] Editor’s Note: “Tech Against Terrorism is an initiative launched and supported by the United Nations Counter Terrorism Executive Directorate (UN CTED) working with the global tech industry to tackle terrorist use of the internet whilst respecting human rights.” “About,” Tech Against Terrorism, n.d.

[e] Editor’s Note: Gamergate is a misogynist online harassment campaign connected to radical right backlash against diversity and progressivism within the online gaming space. It was a popular hashtag in 2014 and 2015, but continues to have impact today. For more on Gamergate, see Emily St. James, “#Gamergate: Here’s why everybody in the video game world is fighting,” Vox, October 13, 2014; Jay Hathaway, “What Is Gamergate, and Why? An Explainer for Non-Geeks,” Gawker, October 10, 2014; and Evan Urquhart, “Gamergate Never Died,” Slate, August 23, 2019.

[f] For example, the perpetrator of the deadly shooting outside a synagogue in Halle, Germany, in October 2019 livestreamed his attack on Twitch, a popular interactive video livestreaming service among gamers. Additionally, in his manifesto, he made references to his “achievements,” which is “typical of computer games. ‘Achievements’, i.e. tasks, are a way of comparing yourself to other players in the game.” The Poway, California, shooter’s manifesto, which he posted to 8chan, referenced a “high score.” Furthermore, “the first 8chan user to respond to Earnest’s attack announcement told him to ‘get the high score’ — videogame parlance by which white ethno-nationalists refer to the death toll of terrorist attacks. ‘Score’ has become an overriding obsession for many online observers since the attack in Christchurch.” See Oscar Gonzalez, “Twitch video of Germany shooting near Halle synagogue included anti-semitic motives,” CNET, October 11, 2019; Kira Ayyadi, “The ‘Gamification’ of Terror – When Hate Becomes a Game,” Belltower News, October 11, 2019; Bridget Johnson, “Another Synagogue Shooting: Manifesto Attributed to Poway Attacker Claims Christchurch, Pittsburgh as Influences,” Homeland Security Today, April 27, 2019; and Emerson T. Brooking, “Christchurch-Style Terrorism Reaches American Shores,” Digital Forensic Research Lab via Medium, April 30, 2019, respectively. For more on gamified elements in attacks, see Robert Evans, “The El Paso Shooting and the Gamification of Terror,” Bellingcat, August 4, 2019, and Linda Schlegel, “Can You Hear Your Call of Duty? The Gamification of Radicalization and Extremist Violence,” European Eye on Radicalization, March 17, 2020.

[g] Editor’s Note: “In 2017, Germany passed the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG) (also called the ‘Facebook Act’). The law did not create any new duties for social media platforms but did impose high fines for noncompliance with existing legal obligations. The Network Enforcement Act is applicable only to social media networks that have 2 million or more registered users in Germany. It obligates the covered social media networks to remove content that is ‘clearly illegal’ within 24 hours after receiving a user complaint.” “Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech,” Library of Congress, February 22, 2022.

[h] Editor’s Note: This draft legislation aims to “protect children online and tackle some of the worst abuses on social media, including racist hate crimes.” See “Landmark laws to keep children safe, stop racial hate and protect democracy online published,” UK Government, May 12, 2021.

[i] Editor’s Note: The Digital Services Act (DSA), a European Union regulation approved in October 2022, provides “a common set of rules on intermediaries’ obligations and accountability across the single market [that] will open up new opportunities to provide digital services across borders, while ensuring a high level of protection to all users.” See “The Digital Services Act: ensuring a safe and accountable online environment,” European Commission, n.d.

Citations
[1] Editor’s Note: For more on this initiative, see “Gaming used as a successful tool to build relationships between police and youth in Cops vs Kids pilot,” British Esports, November 4, 2021.

[2] Editor’s Note: “Essential facts about the video game industry 2022,” Entertainment Software Association (ESA), July 2022.

[3] See Peter Allen Clark, “Experts warn about rising extremism in gaming,” Axios, April 27, 2022; Aaron Tielemans, “A Survey of Violent Extremist and Terrorist Activities Across the Gaming Environment,” Global Network on Extremism & Technology, June 28, 2021; and “Gaming and Extremism: Extremists Evade Mainstream Restrictions in Corners of Gaming World,” Institute for Strategic Dialogue, September 24, 2021.

[4] See Edith M. Lederer, “UN says threat from Islamic State extremists remains high,” ABC News, February 9, 2023, and Rueben Dass, “The Link Between Gaming and Violent Extremism,” Diplomat, March 8, 2023.

[5] Editor’s Note: See “Two self-radicalised Singaporean boys given ISA orders; 15-year-old youngest to be detained,” Channel News Asia, February 21, 2023, and Jean Iau, “2 teens dealt with under ISA; 15-year-old student is youngest-ever detainee,” Straits Times, February 22, 2023. See also Dass.

[6] Editor’s Note: See Daniel Koehler, Verena Fiebig, and Irina Jugl, “From Gaming to Hating: Extreme-Right Ideological Indoctrination and Mobilization for Violence of Children on Online Gaming Platforms,” Political Psychology 44:2 (2022).

[7] Linda Schlegel, “Working Paper 1/2021: The Role of Gamification in Radicalization Processes,” modus | zad, January 14, 2021.

[8] Editor’s Note: “Exposure to White Supremacist Ideologies in Online Gaming Doubled in 2022, New ADL Survey Finds,” Anti-Defamation League, December 7, 2022.

[9] Editor’s Note: See Rachel Kowert, Alexi Martel, and Bill Swann, “Not just a game: Identity fusion and extremism in gaming cultures,” Frontiers in Communication, October 17, 2022.

[10] Graham Macklin, “The Christchurch Attacks: Livestream Terror in the Viral Video Age,” CTC Sentinel 12:6 (2019).

[11] Amarnath Amarasingam, Marc-André Argentino, and Graham Macklin, “The Buffalo Attack: The Cumulative Momentum of Far-Right Terror,” CTC Sentinel 15:7 (2022).

[12] Editor’s Note: Melissa Eddy, Rick Gladstone, and Tiffany Hsu, “Assailant Live-Streamed Attempted Attack on German Synagogue,” New York Times, October 9, 2019.

[13] Editor’s Note: Brian Stelter and Sharif Paget, “Twitch says livestream of Buffalo mass shooting was removed in less than 2 minutes,” CNN, May 15, 2022.
