Inspiration

After all my classes shifted to online learning, I found that lectures were even more unbearable than usual. A little bit of research showed me that this is because passive learning translates poorly to all-online classes. One way for students to actively engage with the material is to create their own questions and research the answers - but in today's world, that means being able to distinguish trustworthy sources from untrustworthy ones. This is where Disrespect Detector is useful.

What it does

Disrespect Detector is a Chrome extension that uses Google's Perspective Comment Analyzer API to evaluate the toxicity of the content on a page. The Perspective API assigns text a "toxicity score", which represents the likelihood that a reader would find it offensive.
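
For reference, here's a rough sketch of what a Perspective API request can look like when made with jQuery's $.ajax. The getToxicityScore helper, the key placeholder, and the exact options are illustrative assumptions rather than the extension's actual code.

```javascript
// Sketch of a Perspective API "analyze" request using jQuery's $.ajax.
// API_KEY is a placeholder for your own key from the Perspective API console.
const API_KEY = "YOUR_API_KEY";
const ENDPOINT =
  "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=" + API_KEY;

function getToxicityScore(pageText) {
  return $.ajax({
    url: ENDPOINT,
    type: "POST",
    contentType: "application/json",
    data: JSON.stringify({
      comment: { text: pageText },
      requestedAttributes: { TOXICITY: {} }
    })
  }).then(function (response) {
    // summaryScore.value is a probability between 0 and 1 that a reader
    // would perceive the text as toxic.
    return response.attributeScores.TOXICITY.summaryScore.value;
  });
}
```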

How I built it

I built this extension using JavaScript, jQuery, and Ajax. The content script uses jQuery to scrape the page's text, then sends it to the background script, which makes the API call. The toxicity score is returned to the content script and displayed to the user.
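
A minimal sketch of the content-script side of that flow might look like the following; the message shape (ANALYZE_PAGE) and the character cap are assumptions for illustration, not the extension's exact code.

```javascript
// content.js -- sketch of the scrape-and-send step.
// Grab the page's visible text with jQuery and cap its length so the
// Perspective API request stays reasonably small.
const pageText = $("body").text().trim().slice(0, 3000);

// Ask the background script to score the text, then show the result.
chrome.runtime.sendMessage({ type: "ANALYZE_PAGE", text: pageText }, function (response) {
  if (response && typeof response.score === "number") {
    // In the real extension the score is displayed to the user; logging stands in here.
    console.log("Toxicity score for this page:", response.score);
  }
});
```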

Challenges I ran into

I don't have a lot of familiarity with JavaScript or jQuery, so this took a lot of debugging and Stack Overflow. I got most of the way through developing a popup, but struggled to manage the async calls and keep the background processes running.
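
For context, the tricky part of that async flow is usually the background listener: if it responds asynchronously, it has to return true so Chrome keeps the message channel open. A sketch of that pattern, assuming the getToxicityScore helper from the earlier example:

```javascript
// background.js -- sketch of handling the content script's request asynchronously.
chrome.runtime.onMessage.addListener(function (message, sender, sendResponse) {
  if (message.type === "ANALYZE_PAGE") {
    // getToxicityScore (see the earlier sketch) returns a promise, so the
    // response is sent once the Perspective API call finishes.
    getToxicityScore(message.text).then(function (score) {
      sendResponse({ score: score });
    });
    return true; // keep the sendResponse channel open for the async reply
  }
});
```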

Accomplishments that I'm proud of

This was my first Chrome extension! And my first hackathon.

What I learned

I learned a lot about the Perspective API, jQuery and Ajax, and the structure of Chrome extensions in general. There was a bit of a learning curve, but I'm more confident writing projects from scratch now.

What's next for Disrespect Detector

I would extend this by building a more developed UI, such as a more interactive popup. Incorporating other measures of source reliability would also make this a more effective resource.

Built With

javascript, jquery, ajax
