Call of Duty adds AI moderation to voice chats to foster positive gaming
The partnership between Activision and Modulate marks a significant step towards a more inclusive and respectful online gaming community, where players can engage without fear of encountering toxic behaviour.
Highlights
- Activision & Modulate's ToxMod AI detects toxicity in real-time voice chats, improving the gaming experience
- ToxMod goes beyond transcription, assessing emotions and context for precise detection, starting with a North American beta
- ToxMod reports are reviewed by human moderators, blending AI efficiency with human insight for a friendlier gaming community
In the realm of first-person shooters, titles like Call of Duty have become known not only for their thrilling gameplay but also for the less savoury side of their online interactions. Toxic behaviour, offensive language, and heated disputes have unfortunately become synonymous with some gaming lobbies.
To address this challenge, game publisher Activision has joined forces with a company named Modulate to introduce a new layer of artificial intelligence (AI) moderation in its games.
ToxMod AI: A guardian against toxicity
The collaboration has produced ToxMod, an AI-driven technology developed by Modulate, designed to detect and respond to behaviours such as hate speech, discrimination, and harassment in real time during in-game voice chats.
📣We're thrilled to announce our partnership with @Activision to bring #ToxMod to @CallofDuty: Modern Warfare II and Warzone, plus Modern Warfare III when it launches in November!
💪🏽Read more on our blog: https://t.co/75mEVcvA4W pic.twitter.com/MN3YxLOfSN
— Modulate (@modulate_ai) August 30, 2023
This initiative aligns with Activision's ongoing efforts to foster a more positive and respectful gaming environment. The initial beta rollout of ToxMod has commenced in North America, specifically integrated into Call of Duty: Modern Warfare II and Call of Duty: Warzone.
A broader international release, excluding Asia, is slated for November 10th, coinciding with the launch of this year's instalment in the franchise, Call of Duty: Modern Warfare III. Modulate's official statement provides a glimpse into how ToxMod operates.
The tool employs a multifaceted approach, analysing voice chats to identify problematic behaviour, assessing the subtleties of conversations to gauge toxicity levels, and facilitating prompt responses from moderators by presenting relevant contextual information.
Notably, the CEO of Modulate emphasised that ToxMod goes beyond basic speech transcription, taking into account factors like the emotional tone and volume of a player's speech. This depth of understanding allows the AI to distinguish between harmful remarks and light-hearted banter.
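Modulate has not published ToxMod's internals, so the description above — weighing what is said against how it is said, then escalating only flagged clips to human moderators — can only be illustrated with a toy sketch. Everything below is a hypothetical illustration: the class, the keyword list, the weights, and the threshold are invented for this example and are not Modulate's model or API.

```python
from dataclasses import dataclass

@dataclass
class VoiceClip:
    """A short segment of in-game voice chat (hypothetical structure)."""
    transcript: str   # what was said
    emotion: str      # tone inferred from audio, e.g. "angry" or "playful"
    loudness: float   # normalised volume, 0.0-1.0

# Stand-in keyword list; a real system would use a trained model.
TOXIC_TERMS = {"slur", "threat", "harassment"}

def toxicity_score(clip: VoiceClip) -> float:
    """Combine content and delivery, as the article describes:
    the same words score lower when the tone is light-hearted."""
    content = sum(term in clip.transcript.lower() for term in TOXIC_TERMS) / len(TOXIC_TERMS)
    delivery = 1.0 if clip.emotion == "angry" else 0.3
    return content * (0.5 + 0.5 * delivery * clip.loudness)

def review_queue(clips: list[VoiceClip], threshold: float = 0.2) -> list[VoiceClip]:
    """ToxMod-style triage: the AI takes no punitive action itself;
    it only forwards flagged clips, with context, to human moderators."""
    return [c for c in clips if toxicity_score(c) >= threshold]
```

The point of the two-factor score is the distinction the CEO draws: identical words delivered as shouted hostility land above the review threshold, while the same words in quiet, playful banter can fall below it, so only the former reaches a human moderator.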
A helping hand, not a final verdict
It's important to highlight that ToxMod, at least for the present, will not autonomously take punitive actions against players based on its evaluations. Instead, it will generate reports that will be reviewed by Activision's human moderators.
This cautious approach acknowledges the potential biases that can arise in automated speech recognition systems, which have been shown to perform differently for users of diverse racial backgrounds and accents.
In essence, while AI offers an invaluable tool in curbing toxicity, human oversight remains an integral safeguard. As the gaming landscape continues to evolve, the collaborative effort between Activision and Modulate signals a noteworthy stride towards fostering a more welcoming and respectful online gaming community, where players can engage without fear of encountering toxic behaviour.