Inspiration

Seeing and dealing with rude and toxic comments on popular forums like YouTube and Reddit, and being aware that sometimes it might be you who leaves that rude comment, without even realizing it.

What it does

This Chrome extension warns you and reminds you not to get too heated when it detects that you are in the process of leaving a particularly rude or toxic comment, using Google's Perspective API, a machine-learning service for scoring comment toxicity. It reads the user's comment from the editable text field in real time and can inform them if their comment is above the toxicity threshold before it is posted.
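A minimal sketch of that real-time check as a content script, assuming a checkToxicity() helper (sketched under "How I built it" below); the threshold and the 500 ms debounce delay are illustrative values, not the extension's actual settings:

```javascript
// Content-script sketch: score the draft comment after a pause in
// typing and warn if it crosses the threshold. THRESHOLD and the
// 500 ms delay are illustrative placeholders.
const THRESHOLD = 0.8;
let debounceTimer;

document.addEventListener('input', (event) => {
  const field = event.target;
  // Only react to editable comment fields.
  if (field.tagName !== 'TEXTAREA' && !field.isContentEditable) return;

  clearTimeout(debounceTimer);
  debounceTimer = setTimeout(async () => {
    const text = field.value ?? field.textContent;
    const score = await checkToxicity(text); // see "How I built it"
    if (score > THRESHOLD) {
      alert(`Heads up: this draft scores ${score.toFixed(2)} for toxicity.`);
    }
  }, 500);
});
```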

How I built it

  • JavaScript, POST requests to the Perspective API, and a local Node.js instance (see the sketch below)
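A minimal sketch of the Perspective API call, assuming an API key in PERSPECTIVE_KEY (a placeholder). Whether it runs in the extension or is proxied through the local Node.js instance, the request shape is the same; fetch is global in browsers and Node 18+:

```javascript
// Sketch of the Perspective API request. PERSPECTIVE_KEY is a
// placeholder; error handling is omitted for brevity.
const PERSPECTIVE_KEY = 'YOUR_API_KEY';

async function checkToxicity(text) {
  const url =
    'https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze' +
    `?key=${PERSPECTIVE_KEY}`;
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      comment: { text },
      requestedAttributes: { TOXICITY: {} },
    }),
  });
  const data = await response.json();
  // summaryScore.value is a probability-like toxicity score in [0, 1].
  return data.attributeScores.TOXICITY.summaryScore.value;
}
```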

Challenges I ran into

  • Figuring out how to detect when a user is typing a comment was difficult: which text fields are active, and when should we collect the user's input? (One approach is sketched below.) We also sometimes spent a lot of time on something just to find out that it had already been made by someone else.
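One plausible approach to the field-detection problem, listening for focus events at the document level and treating any editable element as a candidate comment box (the selectors are illustrative, not site-specific):

```javascript
// Sketch: detect candidate comment fields via event delegation,
// since sites like YouTube and Reddit build their comment boxes
// differently (textareas vs. contenteditable divs).
document.addEventListener('focusin', (event) => {
  const el = event.target;
  const isEditable =
    (el.matches && el.matches('textarea, input[type="text"]')) ||
    el.isContentEditable;
  if (isEditable) {
    console.log('Candidate comment field focused:', el);
  }
});
```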

Accomplishments that I'm proud of

We got a working extension running against a localhost Node.js instance, written in JavaScript; none of us had substantial experience with either coming into this hackathon.

What I learned

A lot about JavaScript, how to build a Chrome extension, how frustrating creating an extension can be, and how fun hackathons are!

What's next for TypeMeNot2

Improving the graphics. As of right now, we show a full-on alert for toxicity above a certain threshold, but we want a better representation, such as a color fader with a multiplier based on the toxicity score. For example, the icon would be bright red for extremely offensive comments and dark blue for inoffensive ones. (A sketch of that color mapping follows.)
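A minimal sketch of the proposed color fader, mapping a toxicity score in [0, 1] to an icon color; the RGB endpoints here are illustrative choices, not final design values:

```javascript
// Sketch: interpolate the extension icon's color from dark blue
// (score 0) to bright red (score 1).
function toxicityColor(score) {
  const t = Math.min(Math.max(score, 0), 1); // clamp to [0, 1]
  const red = Math.round(255 * t);           // more red as toxicity rises
  const blue = Math.round(139 * (1 - t));    // dark blue fades out
  return `rgb(${red}, 0, ${blue})`;
}

console.log(toxicityColor(0.05)); // "rgb(13, 0, 132)" -- dark blue
console.log(toxicityColor(0.95)); // "rgb(242, 0, 7)"  -- near bright red
```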

Built With

  • JavaScript
  • Node.js
  • Perspective API
