Experts are raising the alarm about the rise of extremism in popular online games and how easy it can be for children to stumble upon hateful or violent content.
A 2023 report from New York University concluded that “extremists are exploiting online gaming and gaming-adjacent sites to promote hatred and violence.”
“The features of these sites are such that they are particularly useful to people who would happen to spread bad ideas or try to recruit people into conspiracies or even particular acts of violence,” Paul Barrett, a deputy director of the Center for Business and Human Rights at NYU’s Stern School of Business, told “Good Morning America.”
The FBI told ABC News online platforms such as gaming sites are one of the biggest challenges the federal agency faces in its efforts to counter violent extremism.
The FBI said those platforms “have all contributed to the increased speed, dissemination, efficiency, and accessibility of violent extremist content.”
Julia Ebner, a senior resident research fellow at the Institute for Strategic Dialogue who studies extremism, said hobby groups like gaming communities are specifically being targeted as a recruiting tactic.
“Increasingly, you see minors and even school kids being lured into these spaces and not realizing what is happening to them, that they’re actually slowly being radicalized toward neo-Nazism or toward white supremacy,” Ebner said.
One way children can be influenced, according to Ebner, is through “mods,” user-created modifications of popular games that can introduce hateful content.
Ebner showed “GMA” one mod of the popular video game series “Call of Duty,” published by Activision.
“This is now a modification of ‘Call of Duty’ where the players can choose to play on the side of the Nazis,” Ebner explained.
Activision declined to comment to ABC News about such mods.
Similar content can be found on other online platforms like the social and messaging app and site Discord, where users often go to discuss games.
Experts say there are steps parents can take to help protect their kids. Both parents and children can sit down together to explore online spaces and test out safety controls. In some cases, parents can also turn off the option for kids to communicate with other players online or block specific games on a child’s device if it’s not age appropriate.
Online extremism can have real consequences. According to the New York Attorney General’s office, the 18-year-old behind the racially motivated May 2022 mass shooting at a Tops Friendly Market in Buffalo used Twitch, an online platform used by video game streamers, to broadcast the attack, which left 10 Black people dead.
Twitch took the livestream down within minutes and released a statement condemning the violence at the time.
“We take our responsibility to protect our community extremely seriously, and trust and safety is a major area of investment,” Twitch said in its statement, adding that it would examine the attack and committed to “sharing those learnings with our peers in the industry to support a safer internet overall.”
Twitch, Discord and the trade group Entertainment Software Association have individually told ABC News they are committed to combating hateful content and taking a multifaceted approach to meet challenges, including banning users, developing technology to identify hateful content and working with law enforcement.
But some experts say leaving the industry to address problems may not be enough.
“We need regulation,” Barrett said. “Greater publicity and public understanding about what’s going on has the potential to create pressure on the companies to do what they should be doing on their own.”
Copyright © 2023, ABC Audio. All rights reserved.