Inspiration

We were inspired to help prevent toxic and unsafe work environments, and to give a voice to people who may feel too embarrassed or unsafe to bring problems to light.

What it does

This is a real-time chat app that analyzes the sentiment of every message to help detect and prevent abuse, violence, or harassment between co-workers.

How we built it

We built the app with a React.js PWA frontend and a Node.js/Express.js backend. To create a live, real-time environment we used Firebase. We deployed the API on a Heroku cloud server and our frontend client on Firebase. We used Google's powerful Cloud Natural Language tools to analyze the sentiment score and magnitude behind each message.
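A minimal sketch of what the server-side analysis could look like, assuming the `@google-cloud/language` Node.js client (which requires GCP credentials); the flagging thresholds and function names here are illustrative, not the values actually used in the app:

```javascript
// Hypothetical flagging rule: a message is suspicious when sentiment is
// strongly negative AND emotionally intense. Per the Natural Language API,
// score is in [-1, 1] and magnitude is >= 0. Thresholds are illustrative.
function isFlagged(score, magnitude, scoreThreshold = -0.6, magnitudeThreshold = 0.5) {
  return score <= scoreThreshold && magnitude >= magnitudeThreshold;
}

// Analyze one chat message with Google's Cloud Natural Language API.
async function analyzeMessage(text) {
  // Loaded lazily; needs `npm install @google-cloud/language` and credentials.
  const language = require('@google-cloud/language');
  const client = new language.LanguageServiceClient();

  const [result] = await client.analyzeSentiment({
    document: { content: text, type: 'PLAIN_TEXT' },
  });

  const { score, magnitude } = result.documentSentiment;
  return { score, magnitude, flagged: isFlagged(score, magnitude) };
}
```

Keeping the threshold logic in its own small function makes it easy to tune where the line is drawn without touching the API call.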

Challenges we ran into

Figuring out where the line for harassment, abuse, and violence lies. This is really important to take into account to prevent wrongful accusations and avoid implicating innocent parties, while making sure everyone's voice is heard. We also chose to keep the parties involved anonymous to protect privacy and safety, prevent bias and untruthfulness, and maintain integrity until the proper authorities investigate the situation.

Accomplishments that we're proud of

A two-person team! Using the PWA architecture so people can install it, we worked really hard to make it look and feel like a smooth native mobile app (if you use it on your phone)! We shed light on situations where people's voices would otherwise go unheard. Also, the chat and admin monitoring pages update in real time, so there's no need to annoyingly refresh the page. We do the work for you :)
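The real-time updates could be sketched roughly like this with the Firestore Web SDK; the `messages` collection, `createdAt` field, and `renderMessage` callback are hypothetical names, not the project's actual schema:

```javascript
// Pure helper: pull newly added messages out of a Firestore-style snapshot.
// Kept separate from the SDK so it can be tested with a plain object.
function newMessages(snapshot) {
  const added = [];
  snapshot.docChanges().forEach((change) => {
    if (change.type === 'added') added.push(change.doc.data());
  });
  return added;
}

// Subscribe to the chat; `db` is a Firestore instance and `renderMessage`
// is a UI callback. onSnapshot fires on every write, which is why neither
// the chat nor the admin monitoring page needs a manual refresh.
function subscribeToMessages(db, renderMessage) {
  return db
    .collection('messages')
    .orderBy('createdAt')
    .onSnapshot((snapshot) => {
      newMessages(snapshot).forEach(renderMessage);
    });
}
```

The same listener pattern serves both the chat view and the admin page; only the callback differs.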

What's next for chatalyze

With more time, we would add more features to increase privacy and security. In such a difficult time, we want to keep all parties involved as safe as possible.
