'Call of Duty' is using AI voice moderation tools to curb hate speech
Call of Duty voice chat is famously not great. Activision is hoping AI can fix that.
According to a company blog post on Wednesday, the next major COD release will use AI to observe and report on toxic voice chat in multiplayer matches. Specifically, Activision is deploying ToxMod, a bit of AI tech developed by a company called Modulate, to make sure people aren't super racist while gunning each other down in team deathmatch.
For what it's worth, ToxMod is an interesting piece of tech. As noted by PCGamer, Modulate claims ToxMod can listen to conversational context to determine whether something is truly hate speech.
Modulate also claims ToxMod can detect grooming language used by white supremacist groups. Modern Warfare III will be by far the biggest game ToxMod has been deployed in so far, so it'll be quite a test. One thing to note, however, is that the software itself does not punish players for hate speech. Instead, it merely reports violations of the Call of Duty code of conduct, and humans at Activision take further action.
Of all the different potential uses for AI, this one is perhaps less offensive than most. After all, it's not trying to replicate or replace human creativity; instead, it could spare people from having to listen to hate speech. But that also means there could be real limitations to the tech, or ways to circumvent its filters.
We'll all find out over the next few months, as every Call of Duty player becomes a guinea pig.
Alex Perry is a tech reporter at Mashable who primarily covers video games and consumer tech. Alex has spent most of the last decade reviewing games, smartphones, headphones, and laptops, and he doesn’t plan on stopping anytime soon. He is also a Pisces, a cat lover, and a Kansas City sports fan. Alex can be found on Bluesky at yelix.bsky.social.