Activision Intensifies Efforts to Combat Toxicity in Call of Duty Franchise
Activision has made significant strides in addressing toxic behavior within the Call of Duty community as it prepares to launch the franchise's latest installment, Call of Duty: Black Ops 6. The franchise has long been known for a toxic player environment, and Activision has initiated robust measures to curb such behavior, particularly in in-game voice and text chat, with advances in artificial intelligence (AI)-driven moderation.
One of the main technologies employed by Activision is ToxMod from Modulate. This AI-powered tool operates in real time to detect and manage toxic speech, including hate speech, discriminatory language, and harassment. Addressing privacy concerns, Activision emphasizes that voice chat monitoring is solely for moderation purposes and targets harmful interactions rather than specific keywords.
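Modulate has not published ToxMod's internals, but the "harmful interactions, not keywords" distinction can be illustrated with a toy triage routine: moderation acts on classified harm categories and model confidence scores rather than matching a fixed word list. Everything below (the `Flag` type, the category names, the threshold) is a hypothetical sketch, not Modulate's actual API.

```python
from dataclasses import dataclass

@dataclass
class Flag:
    """One harm category a model assigned to a transcribed voice clip."""
    category: str      # e.g. "hate_speech" or "harassment" (illustrative names)
    confidence: float  # model confidence, 0.0 to 1.0

def triage(flags: list[Flag], threshold: float = 0.9) -> str:
    """Route a clip based on its flags, not on any specific words it contains."""
    if any(f.confidence >= threshold for f in flags):
        return "escalate_for_enforcement"   # high-confidence harm
    if flags:
        return "log_for_pattern_tracking"   # low-confidence: watch for repeats
    return "no_action"

print(triage([Flag("harassment", 0.95)]))  # escalate_for_enforcement
```

The design point of such a pipeline is that a clip with no flags is dropped immediately, which is one way a system can monitor voice chat while acting only on harmful interactions.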
Dylan Collins, CEO of Modulate, stated, “We are excited to see our technology being implemented in such a large platform. The reduction in toxicity is an encouraging step towards improving the gaming environment for all players.”
Activision’s Disruptive Behavior team commented on the progress, “Voice and text-based moderation tools in Call of Duty don’t undermine the competitive spirit inherent to the game. Instead, they enforce the Call of Duty franchise Code of Conduct, focusing on reducing harassment and derogatory language.”
Since implementing enhanced voice chat moderation in June, the company has recorded a 67% reduction in repeat offenders for voice chat-related infractions in Modern Warfare 3 and Warzone. July 2024 metrics showed that 80% of players penalized for voice chat violations did not reoffend. Activision further reported a 43% decrease in overall player exposure to disruptive voice chat since January.
With the anticipated release of Black Ops 6 on October 25, voice moderation capabilities will extend beyond English, Spanish, and Portuguese to include French and German. This expansion aims to reach a broader audience and further mitigate toxic interactions.
To reinforce these efforts, Activision collaborates with Community Sift to manage text chat moderation across 20 languages, having intercepted over 45 million offensive messages since November 2023. Additionally, a new system has been introduced to more effectively examine reports of inappropriate usernames and clan tags.
Research partnerships with institutions such as the California Institute of Technology (Caltech) and the University of Chicago Booth School of Business support these initiatives. These collaborations aim to refine tactics for identifying and combating disruptive conduct while promoting prosocial behavior in gaming.
Despite these advancements, maintaining a clean gaming environment presents ongoing challenges. Activision's strategy includes improving in-game reporting mechanisms, fostering transparency with players, and enhancing fairness in moderation, striving to reduce toxic exposure for players and moderators alike.