Call of Duty takes aim at voice chat toxicity

After taking steps to curb cheating in its games, Activision, the publisher behind the Call of Duty series, is now focusing on ridding them of toxic voice chat. To do so, it is teaming with Modulate to provide real-time voice chat moderation.

Read on for more:

Call of Duty’s new voice chat moderation system uses ToxMod, the AI-powered voice chat moderation technology from Modulate, to identify toxic speech in real time, including hate speech, discriminatory language, and harassment, and to enforce against it. This new system will bolster the ongoing moderation efforts led by the Call of Duty anti-toxicity team, which include text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.

“There’s no place for disruptive behavior or harassment in games ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming. With this collaboration, we are now bringing Modulate’s state-of-the-art machine learning technology that can scale in real time for a global level of enforcement,” said Michael Vance, Chief Technology Officer, Activision. “This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players.”

An initial beta rollout of the voice chat moderation technology begins in North America on August 30 within the existing titles Call of Duty: Modern Warfare II and Call of Duty: Warzone, followed by a full worldwide release (excluding Asia) timed to the launch of Call of Duty: Modern Warfare III on November 10. Support will begin in English, with additional languages to follow at a later date.