CW: Suicide

What it does

Safe & Sound is a platform that allows users to define their support network by inputting trusted contacts. Safe & Sound uses NLP techniques to analyze the user's messaging behavior for signs of suicidal ideation. If detected, the platform will send warning texts to the user's trusted contacts, encouraging community members to look out for each other.
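The alert flow described above can be sketched roughly as follows. This is a minimal illustration, not the project's code: `looks_risky` is a hypothetical keyword stand-in for the trained NLP model, and the phrase list and message wording are invented for the example.

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for the trained classifier described below;
# the real platform uses an NLP model, not a phrase list.
RISK_PHRASES = ["want to die", "end my life"]

def looks_risky(message: str) -> bool:
    text = message.lower()
    return any(phrase in text for phrase in RISK_PHRASES)

@dataclass
class SupportNetwork:
    user: str
    contacts: list = field(default_factory=list)  # trusted contacts' numbers

    def handle_message(self, message: str) -> list:
        """Return (number, body) alert pairs if the message looks risky."""
        if not looks_risky(message):
            return []
        body = f"Safe & Sound alert: {self.user} may need support."
        return [(number, body) for number in self.contacts]
```

In the real platform the returned pairs would be handed to an SMS service rather than returned to the caller; the dataclass is just a compact way to show the user-plus-contacts data the description implies.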

How we built it

Our stack consists of a Django backend and a React & Redux frontend, with user and contact data stored in MongoDB. To build a dataset for text classification, we pulled from two subreddits, r/SuicideWatch and r/CasualConversation. We used the Google Cloud NLP API for sentiment analysis to refine our dataset, then trained a CNN model with spaCy. Our UI allows users to register and input contacts' phone numbers. To showcase our functionality, a user can interact with basic text input and a chatbot. When language that could indicate suicidality is detected, the platform sends the user's contacts an SMS through Twilio that includes what triggered the alert and the user's location.
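The dataset-labeling step might look like the sketch below: posts from r/SuicideWatch become positive examples and r/CasualConversation posts become negative ones, with a sentiment-score filter standing in for the Google Cloud NLP refinement. The `score` callable and the `threshold` value are assumptions for illustration, not the project's actual pipeline.

```python
def build_dataset(suicidewatch_posts, casual_posts, score, threshold=-0.25):
    """Label posts by source subreddit, filtering ambiguous positives.

    score: callable returning a sentiment value in [-1.0, 1.0]
           (hypothetical stand-in for the Google Cloud NLP API call).
    threshold: illustrative cutoff; only clearly negative-sentiment
           r/SuicideWatch posts are kept as positive examples.
    """
    examples = []
    for text in suicidewatch_posts:
        if score(text) <= threshold:
            examples.append({"text": text, "label": "SUICIDAL"})
    for text in casual_posts:
        examples.append({"text": text, "label": "SAFE"})
    return examples
```

The resulting list of text/label pairs is the shape of input a spaCy text-classification model expects during training.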

Challenges we ran into

There are datasets on suicidal language available for research purposes, but they require strict approval processes for ethical reasons, so we had to create our own dataset. Furthermore, we found that Google Cloud's AutoML takes hours to train, so it wasn't feasible to use it as our final model.

What we learned

The topic of this project is heavy, and it was sobering to get a glimpse of Reddit's r/SuicideWatch, a forum that has been the subject of many news articles and studies on suicide over the past few years. The forum's uncensored content reveals a common thread among anonymous posters: feelings of isolation. This highlights how community care and awareness are crucial in promoting mental health.

Slides

https://docs.google.com/presentation/d/1ejPDUOlGLlYqcJaVpe75NDQ-lZCjsFBthIyavILDBxI/edit?usp=sharing
