Inspiration

Cyberbullying and hate speech are huge problems among teens and can lead to poorer mental health and worse school performance.

What it does

This app maintains student anonymity by acting as a third party that transports messages between users. It uses deep learning to analyze the toxicity of each message and determine whether it is offensive. If a message is deemed offensive, it is sent for review to a group of friends the user adds (their "squad"), who then vote on whether it is safe to deliver. The goal is safety without censorship.
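A minimal sketch of that flow in Python is below. The function names, threshold, and storage wrapper are hypothetical stand-ins for illustration, not the project's actual API.

```python
TOXICITY_THRESHOLD = 0.5  # assumed cutoff for flagging a message

def handle_message(recipient, text, model, store):
    """Classify a message and either deliver it or queue it for squad review."""
    score = model.predict_toxicity(text)  # hypothetical model wrapper
    if score < TOXICITY_THRESHOLD:
        store.deliver(recipient, text)                  # safe: deliver anonymously
    else:
        store.queue_for_review(recipient, text, score)  # flagged: squad votes

def resolve_review(message_id, votes, store):
    """Deliver the message only if a majority of the squad approves it."""
    approvals = sum(1 for v in votes if v == "approve")
    if approvals > len(votes) / 2:
        store.release(message_id)
    else:
        store.discard(message_id)
```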

How I built it

Built with Flask, Firebase, a JavaScript web app, Python, and a dapp (decentralized app). The model is a convolutional neural network trained on text vectors (text represented as sequences of integers), using the Wikipedia Talk Labels: Toxicity dataset (https://figshare.com/articles/Wikipedia_Talk_Labels_Toxicity/4563973). Squads decide the safety of flagged messages through decentralized voting.
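Below is a minimal Keras sketch of this kind of CNN text classifier. The layer sizes, vocabulary size, and preprocessing are assumptions for illustration, not the exact architecture used in the project; the placeholder examples stand in for the Wikipedia Talk Labels: Toxicity data.

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

VOCAB_SIZE, MAX_LEN = 20000, 200  # assumed vocabulary and sequence length

def vectorize(texts, tokenizer):
    """Turn raw comments into padded integer sequences ('text as digits')."""
    return pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=MAX_LEN)

# Placeholder examples; the real labels come from the Wikipedia toxicity dataset.
texts = ["you are great", "you are an idiot"]
labels = np.array([0, 1])  # 0 = clean, 1 = toxic

tokenizer = Tokenizer(num_words=VOCAB_SIZE)
tokenizer.fit_on_texts(texts)
X = vectorize(texts, tokenizer)

# 1D convolution over the embedded token sequence, sigmoid toxicity score out.
model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 64),
    layers.Conv1D(128, 5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=2, verbose=0)
```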

Challenges I ran into

Flask and Firebase were difficult to connect: we needed Python for the machine learning, so we couldn't use Node.js, which we were more comfortable with. The dapp built with Solidity and Ethereum for the decentralized voting system (the squad votes on whether a message should be approved) was finished and running on localhost, but we didn't have enough time to integrate it into the final project.
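For reference, here is a minimal sketch of wiring Flask to Firebase from Python using the firebase-admin SDK. The credential file, database URL, and route are assumptions for illustration, not the project's actual configuration.

```python
import firebase_admin
from firebase_admin import credentials, db
from flask import Flask, request, jsonify

# Hypothetical service-account key and database URL.
cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred, {"databaseURL": "https://example-project.firebaseio.com"})

app = Flask(__name__)

@app.route("/messages", methods=["POST"])
def post_message():
    """Store an incoming message in the Realtime Database for later review."""
    payload = request.get_json()
    ref = db.reference("messages").push({
        "recipient": payload["recipient"],
        "text": payload["text"],
    })
    return jsonify({"id": ref.key}), 201

if __name__ == "__main__":
    app.run(debug=True)
```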

Accomplishments that I'm proud of

A fully functional prototype with a clean frontend and an efficient backend.

What I learned

Flask and Firebase, and especially Solidity and Ethereum for the decentralized voting system.

What's next for SquadTalk

Integrate with chat apps, email, and social media sites, and deploy beyond this proof-of-concept website.
