After joining many different unmoderated Discord servers, we noticed how bad the toxicity could get. After a heated discussion, people would constantly hurl insults at each other, and with no moderator present, there was no way to stop them. This toxicity creates an unsafe environment for the other members of the channel, and it harms the people causing it as well. The tension caused by the insults often detracts from the original purpose of the chat and makes everyone in the server uncomfortable.
What It Does
Our program aims to prevent these toxic situations from occurring, specifically on the chat app Discord. We created a Discord bot that recognizes negative messages in the chat. Each user in the server is given a score, and when a user says something negative, their score decreases. If their score drops low enough, the bot warns them that it is low. If the user continues to send toxic messages, the bot can ban them from the server, removing the source of the toxic behavior. After a warning, the user can only regain their score by waiting a set amount of time; by the time their score is back up, the reason for their toxic behavior has most likely passed.
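The scoring mechanism above can be sketched roughly as follows. This is a minimal illustration, not the bot's actual code: the starting score, thresholds, penalty scaling, and recovery interval are all assumed values, and `ToxicityTracker` is a hypothetical name.

```python
import time

# Illustrative assumptions -- the real bot's values are not given in the write-up.
START_SCORE = 100
WARN_THRESHOLD = 50        # warn the user at or below this score
BAN_THRESHOLD = 20         # ban the user at or below this score
RECOVERY_INTERVAL = 3600   # seconds the user must wait per recovery tick
RECOVERY_AMOUNT = 10       # points regained per tick

class ToxicityTracker:
    """Tracks a per-user score that drops on negative messages
    and slowly recovers over time."""

    def __init__(self):
        self.scores = {}         # user_id -> current score
        self.last_recovery = {}  # user_id -> timestamp of last recovery tick

    def penalize(self, user_id, negativity):
        """Lower the score in proportion to how negative the message
        was (0.0 .. 1.0) and return the action the bot should take."""
        score = self.scores.get(user_id, START_SCORE)
        score -= int(negativity * 25)  # illustrative scaling factor
        self.scores[user_id] = score
        self.last_recovery.setdefault(user_id, time.time())
        if score <= BAN_THRESHOLD:
            return "ban"
        if score <= WARN_THRESHOLD:
            return "warn"
        return "ok"

    def recover(self, user_id, now=None):
        """Regain points for every full RECOVERY_INTERVAL that has
        elapsed since the last recovery tick."""
        now = time.time() if now is None else now
        last = self.last_recovery.get(user_id, now)
        ticks = int((now - last) // RECOVERY_INTERVAL)
        if ticks > 0:
            score = self.scores.get(user_id, START_SCORE)
            self.scores[user_id] = min(START_SCORE, score + ticks * RECOVERY_AMOUNT)
            self.last_recovery[user_id] = last + ticks * RECOVERY_INTERVAL
        return self.scores.get(user_id, START_SCORE)
```

With these assumed numbers, four maximally negative messages take a user from 100 down through a warning and into ban territory, and waiting restores the score in small increments.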
How We Did It
To create this bot, we used Python as our language, building on the discord.py library, MongoDB, and two tone-analysis APIs. discord.py let us create a bot with access to the messages on the server, which is how we collect potentially toxic messages. The Watson Tone Analyzer API, together with a word-vectors API, uses machine learning to let the bot judge how negative a message really is. MongoDB provides the database where we record information from the server: the server the bot is on, the users on that server, and each user's toxicity score. By collecting, analyzing, and recording the messages on Discord, we can detect when a user is too toxic and act accordingly, either warning or banning the user.
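One piece of this pipeline, turning an analyzer response into a single negativity value, can be sketched as a pure function. The response shape follows the Watson Tone Analyzer v3 `document_tone` format; the tone weights and the `negativity` helper itself are illustrative assumptions, not the bot's actual weighting.

```python
# Assumed weights for how strongly each Watson tone counts as "negative".
NEGATIVE_TONES = {"anger": 1.0, "disgust": 0.9, "fear": 0.5, "sadness": 0.4}

def negativity(tone_response):
    """Reduce a Tone Analyzer response to one negativity value in [0, 1],
    taking the worst weighted negative tone found in the message."""
    tones = tone_response.get("document_tone", {}).get("tones", [])
    worst = 0.0
    for tone in tones:
        weight = NEGATIVE_TONES.get(tone.get("tone_id"), 0.0)
        worst = max(worst, weight * tone.get("score", 0.0))
    return worst
```

In the bot, a value like this would be computed inside the discord.py `on_message` handler for each incoming message, then fed into the per-user score stored in MongoDB.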