This new arrival in Call of Duty promises to give toxic gamers a hard time
Activision will deploy an operator unlike any other in the upcoming Call of Duty: Modern Warfare III. Its mission: combating player toxicity, a behavior that is unfortunately all too common in this online game.

Call of Duty: Modern Warfare III © Activision

All Call of Duty players know exactly what we are talking about. Toxicity is rampant in the franchise’s multiplayer mode: some use voice chat to insult, threaten, and harass other players. Activision is well aware of the problem and has been trying to stamp it out for several years. The effort continues with the next installment in the series, Call of Duty: Modern Warfare III and its nostalgia-filled maps.

Activision announced on the official Call of Duty blog that a new recruit is arriving in Call of Duty: Modern Warfare III. As a reminder, this third rebooted Modern Warfare is expected on November 10, 2023 on PC, PS5, PS4, Xbox Series, and Xbox One. This new recruit is an AI, but not just any AI.

Read also: To fight toxic players, Intel wants to use artificial intelligence!

ToxMod, the AI coming to the rescue of Call of Duty: Modern Warfare III to combat toxic behavior

This AI, called ToxMod, aims to combat toxicity in Call of Duty’s online games. It is a real-time voice chat moderation system able to identify violations such as hate speech, discriminatory language, and harassment attempts.

Before integrating ToxMod into Call of Duty: Modern Warfare III, Activision plans to launch a beta version of its tool on… Call of Duty: Modern Warfare II and Call of Duty: Warzone. The beta will be available this weekend, but will only monitor English-speaking players for now.


Clearly, the ToxMod AI isn’t Activision’s only attempt to combat toxic behavior. The publisher already has a text filtering system for messages shared in the game’s written chat, which supports 14 languages to catch as much abuse as possible. Players also have an in-game reporting tool that lets them flag toxic behavior immediately.

To give you a clearer idea, Activision has already restricted more than a million accounts that did not respect the rules of Call of Duty. The good news is that 20% of these players did not reoffend after their first warning. The arrival of an AI-powered voice monitoring system should help improve these numbers.

Source: Activision
