Toxicity detection in multiplayer online games

Marcus Märtens, Siqi Shen, Alexandru Iosup, Fernando A. Kuipers

Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Social interaction in multiplayer online games is an essential feature for a growing number of players worldwide. However, interaction between players can lead to undesired and unintended behavior, particularly if the game is designed to be highly competitive. Communication channels can be abused to harass and verbally assault other players, which defeats the very purpose of games as entertainment by creating a toxic player community. Using a novel natural language processing framework, we detect profanity in the chat logs of a popular Multiplayer Online Battle Arena (MOBA) game and develop a method to classify toxic remarks. We show that toxicity is non-trivially linked to success in the game.
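The paper's framework itself is not reproduced here, but the basic idea of flagging profanity in chat logs can be illustrated with a minimal sketch. The lexicon, function names, and threshold below are hypothetical placeholders, not the authors' method, which classifies toxic remarks with a dedicated NLP framework rather than simple keyword matching.

```python
# Illustrative sketch only (NOT the paper's method): score a match's chat log
# by the fraction of lines containing words from a small profanity lexicon.
import re

# Hypothetical toy lexicon; a real system would use a curated word list
# and context-aware classification.
PROFANITY = {"noob", "idiot", "trash"}

def is_profane(line: str) -> bool:
    """Return True if any token in the chat line matches the lexicon."""
    tokens = re.findall(r"[a-z']+", line.lower())
    return any(tok in PROFANITY for tok in tokens)

def toxicity_score(chat_log: list[str]) -> float:
    """Fraction of chat lines flagged as profane (0.0 for an empty log)."""
    if not chat_log:
        return 0.0
    return sum(is_profane(line) for line in chat_log) / len(chat_log)

log = ["gg wp", "you are such a noob", "report this idiot", "nice play"]
print(toxicity_score(log))  # 0.5
```

A keyword lexicon like this misses obfuscated or context-dependent toxicity, which is precisely the gap a learned classifier over chat logs aims to close.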
Original language: English
Title of host publication: 2015 International Workshop on Network and Systems Support for Games, NetGames 2015, Zagreb, Croatia, December 3-4, 2015
Publisher: ACM, IEEE Computer Society
Pages: 1-6
Number of pages: 6
Volume: 2016-January
ISBN (Electronic): 9781509000685
DOIs
Publication status: Published - 13 Jan 2016
Externally published: Yes
Event: International Workshop on Network and Systems Support for Games, NetGames 2015 - Zagreb, Croatia
Duration: 3 Dec 2015 - 4 Dec 2015

Workshop

Workshop: International Workshop on Network and Systems Support for Games, NetGames 2015
Country: Croatia
City: Zagreb
Period: 3/12/15 - 4/12/15

