Regulate toxic online video games the same as social media
After years of paralysis, the U.S. and European Union have been moving to regulate social platforms.
The EU is taking tentative steps to enforce the new Digital Services Act, which came into effect for all online platforms on Feb. 17. Some U.S. states are trying to control how platforms moderate content, and the Supreme Court is now considering the constitutionality of this regulatory foray. Meanwhile, the U.S. Senate is edging closer to approving legislation that would compel online platforms to introduce additional child safety measures.
But these efforts to rein in social media overlook the gargantuan online gaming industry. Video games constitute the largest entertainment industry by far, grossing almost $200 billion in revenue in 2022 (more than the music and film industries combined). Today, online games offer far more than entertainment; they are social platforms that channel user communication and enable networking and community building.
Online games should be part of the regulatory discussion concerning digital platforms — but they seldom are.
Discussions of harms in gaming have historically focused on fears about the content of the games themselves. However, it is the social environment of games that actually poses significant threats to players' safety and well-being. Exposure to hateful speech is routine, and doxxing (exposing someone's private identifying information) and physical threats are also common. Of acute concern is mounting evidence that extremists and other bad actors exploit the preexisting toxic gamer culture to disseminate their hateful ideologies and unleash hate-based harassment.
While players, game makers and members of civil society are familiar with these problems, many companies and policymakers are only just beginning to address (or in some cases even acknowledge) these issues.
Gaming companies and policymakers seem to be operating under a misconception about how the rights to privacy and freedom of speech should apply online. The misconception stems from inadequate attention to the human rights on the other side of the equation: the rights of the victims of extremist harassment and violence.
Hate, harassment and the perpetuation of extremist ideologies rooted in hostility toward specific groups stand in direct contradiction to the very notion of human rights, which is premised on the belief that all human beings have equal worth, regardless of their characteristics or group membership. In practice, exposure to extremist rhetoric and behavior in games has tangible and sometimes devastating effects on victims of harassment, intimidation and, occasionally, violence. In the most severe cases, extremist radicalization through gaming inspires actions that violate individuals' right to life.
The perpetrator of the 2019 Christchurch massacre in New Zealand used gaming chat rooms to engage in and seek support for his racially motivated extremist ideation. His "gamification" of the attack inspired subsequent mass shootings in Poway, Calif.; El Paso, Texas; Halle, Germany; and Buffalo, N.Y., to name a few.
In 2022, the gunman who killed 10 and injured three at a Tops Friendly Market in Buffalo noted that he "probably wouldn't be as nationalistic if it weren't for Blood and Iron on Roblox," a popular online gaming platform.
In the last few years, we, along with other researchers, have spoken publicly about changes we would like to see in industry and policy related to games and extremism, including calls to empower the industry toward self-regulation, open invitations to work with companies, and frameworks for leveraging the positive effects of video games to prevent radicalization.
However, action on these recommendations has been stymied by the presumption of privacy afforded to players in public game lobbies and servers. In our view, this amounts to a distortion of the right to privacy, and of human rights generally, which should serve to protect not the perpetrators but rather the victims of doxxing, stalking, threats to life and other online abuse.
Regulators should consider online games fair game for regulation. Indeed, it is unclear why games are typically excluded from the regulatory approaches applied to social media, given that online multiplayer games are social platforms.
Online games nominally fall within the scope of the Digital Services Act, and the regulation’s enforcers should ensure that game companies comply with its extensive requirements — including publishing human rights risk assessments, removing illegal content, countering disinformation and extremism, issuing transparency reports and providing data to vetted independent researchers. This law provides a significant regulatory push for game companies to put in place responsible business policies and practices.
Similarly, when considering online platform regulation and enforcement, U.S. policymakers — including legislators and federal agencies such as the Federal Trade Commission — should explicitly include game companies within their scope.
When talking about the safety of the internet, especially the safety of children online, there is no more important place to start than digital playgrounds. Game makers, policymakers, researchers and advocates all share the common goal of making play as safe and enjoyable as possible.
It is time to open these conversations and stop ignoring the $200 billion elephant in the room when it comes to regulating social spaces online.
Mariana Olaizola Rosenblat is a policy advisor on technology and law at the Center for Business and Human Rights at New York University’s Stern School of Business.