Our prototype is an app that tracks emotions over time using facial expressions captured by a camera, providing insight into mental health and delivering real-time personalized affirmations. Future updates will let users log activities such as food, exercise, and social interactions, helping identify correlations with emotional patterns. We also plan to extend emotion capture to mobile-device cameras for more frequent tracking throughout the day.
The application uses Python and Flask for the backend, which captures a short webcam clip when a user logs in. The clip is analyzed in MATLAB for emotion detection, and the detected emotion is returned to the backend, which stores it, along with a timestamp and other relevant data, in a MongoDB database for later analysis. The React frontend calls the Flask backend to initiate the analysis and displays the detected emotion, offering affirmations or prompts based on how the user is feeling. The frontend also retrieves data from MongoDB to chart emotional trends and potential triggers over time. This setup leverages MATLAB's analytical capabilities while keeping the backend and frontend cleanly integrated.
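The login-triggered pipeline above can be sketched in a few lines. This is an illustrative outline, not the project's actual code: the function names and document schema are assumptions, the MATLAB call is stubbed with a placeholder, and an in-memory list stands in for the MongoDB collection (a real deployment would use pymongo's `insert_one` on a collection instead).

```python
from datetime import datetime, timezone

def analyze_clip(video_path):
    """Stand-in for the MATLAB emotion-detection step.

    In the real app this would pass the captured clip to MATLAB
    and return the label MATLAB produces.
    """
    return "happy"  # placeholder emotion label

# Stand-in for the MongoDB collection; a real backend would call
# something like `db.emotions.insert_one(record)` via pymongo.
emotion_log = []

def record_emotion(user_id, video_path):
    """Run emotion detection on a login clip and log the result."""
    emotion = analyze_clip(video_path)
    record = {
        "user_id": user_id,
        "emotion": emotion,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    emotion_log.append(record)
    return record
```

The frontend would then hit a Flask route wrapping `record_emotion` and query the stored records to render trends over time.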