Inspiration
The project is intended to help people who are uncomfortable talking about their emotions and well-being.
What it does
EmotionSense uses a computer-vision deep learning model to detect a person's emotion from their face. It then provides personalised feedback on how to manage your current emotions and mental health, and suggests places to go based on how you're feeling.
How we built it
The project was built with a React frontend and a Flask backend. The frontend displays a webcam feed and sends frames to the neural network on the backend. React manages the user experience by dynamically creating and disposing of components.
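As a rough sketch of what such a backend endpoint could look like, here is a minimal Flask route that accepts a base64-encoded frame and returns a predicted label. The route name `/predict`, the JSON shape, and the stub `predict_emotion` function are illustrative assumptions, not the team's actual code:

```python
import base64

from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

def predict_emotion(image_bytes: bytes) -> str:
    """Stub standing in for the CNN: a real version would decode the
    JPEG bytes and run the trained model. Illustrative only."""
    return "happy"

@app.route("/predict", methods=["POST"])
def predict():
    # Assume the frontend posts JSON like {"frame": "<base64 jpeg>"}
    data = request.get_json()
    image_bytes = base64.b64decode(data["frame"])
    return jsonify({"emotion": predict_emotion(image_bytes)})
```

Keeping the model behind a single POST endpoint like this lets the React side stay a thin client: capture a frame, encode it, and render whatever label comes back.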
Challenges we ran into
Deploying a Docker container to Portainer (a container management system), feeding a video stream between React and Flask via base64 encoding, and trying to use PyScript.
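One common snag with base64 video streaming of this kind (a guess at what the team hit, not their confirmed bug) is that frames captured from a canvas via `toDataURL()` arrive as a data URL, so the `data:image/jpeg;base64,` prefix has to be stripped before decoding. A small illustrative helper:

```python
import base64

def decode_data_url(data_url: str) -> bytes:
    """Strip the 'data:image/jpeg;base64,' prefix that canvas.toDataURL()
    prepends, then base64-decode the remaining payload. Decoding the full
    string without stripping the prefix fails. Hypothetical helper name."""
    _, _, payload = data_url.partition(",")
    return base64.b64decode(payload)

# Simulate a frame round-trip: encode fake JPEG bytes, wrap them in a
# data URL as the browser would, then decode on the Flask side.
raw = b"\xff\xd8fake-jpeg-bytes"
data_url = "data:image/jpeg;base64," + base64.b64encode(raw).decode()
assert decode_data_url(data_url) == raw
```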
Accomplishments that we're proud of
24 hours with no sleep, building a working full-stack app, using new technologies, completing a group project in limited time, gaining a deep understanding of how Docker works, questionable Git commit messages.
What we learned
Integrating all the components into an end-to-end application.
What's next for EmotionSense
EmotionSense to the mooooonnnnn!
