Shooter video game Call Of Duty has started using AI to listen out for hate speech during online matches.
Publisher Activision said the moderation tool, which uses machine learning technology, would be able to identify discriminatory language and harassment in real time.
Machine learning allows AI systems to learn and adapt without explicit human instruction, using algorithms and training data to recognise patterns.
The tool being rolled out in Call Of Duty, called ToxMod, is made by a company named Modulate.
Activision’s chief technology officer Michael Vance said it would help make the game “a fun, fair and welcoming experience for all players”.
Toxic voice chat has long been a problem for online video games, with women and minorities particularly targeted.
The issue is exacerbated in popular multiplayer games due to the sheer number of players, with around 90 million people playing Call Of Duty each month.
Rollout begins ahead of next game in series
Activision said its existing tools, including the ability for gamers to report others and the automatic monitoring of text chat and offensive usernames, had already resulted in communications restrictions for one million accounts.
Call Of Duty’s code of conduct bans bullying and harassment, including insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin.
Mr Vance said ToxMod allows the company’s moderation efforts to be scaled up significantly by categorising toxic behaviour based on its severity, before a human decides whether action should be taken.
Players will not be able to opt out of having the AI listen in, unless they completely disable in-game voice chat.
It has so far been added to Call Of Duty's Modern Warfare II and Warzone games, but only in the US.
A full rollout will begin when the next instalment, Modern Warfare III, launches on 10 November.