During the COVID-19 pandemic, many people became bored and stressed due to social distancing and related restrictions. While there are many well-designed apps for alleviating serious mental health issues, most people simply want to have some fun through entertainment (movies, music, etc.). Major entertainment platforms already have robust recommendation systems, but these systems do not take the user's emotional state into account. Hence, a mood-based recommendation app was born.
What it does
The mobile app offers movie and music recommendations by analysing the emotion in the user's face and voice. The app acts as a content aggregator that redirects users to entertainment platforms such as Spotify and Netflix. To maintain the quality of the curation, mental health experts can curate content manually through our admin portal.
How we built it
We developed the mobile app using React Native and Expo, utilising the Camera API for face capture, the Audio API for recording the user's voice, and AsyncStorage for saving preferences.
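The preference-saving layer can be sketched as below. AsyncStorage in React Native exposes promise-based `getItem`/`setItem` that only store strings; here an in-memory stand-in mimics that interface so the sketch runs outside the app, and the key name and default preference shape are illustrative assumptions, not the app's actual schema.

```javascript
// In-memory stand-in for React Native's AsyncStorage (same promise-based
// getItem/setItem contract), so this sketch runs in plain Node.
const storage = {
  _data: new Map(),
  async getItem(key) { return this._data.has(key) ? this._data.get(key) : null; },
  async setItem(key, value) { this._data.set(key, value); },
};

// Hypothetical storage key; the real app may use a different one.
const PREFS_KEY = 'user_prefs';

// Persist preferences as a JSON string, since AsyncStorage only stores strings.
async function savePreferences(prefs) {
  await storage.setItem(PREFS_KEY, JSON.stringify(prefs));
}

// Load preferences, falling back to defaults when nothing is saved yet.
async function loadPreferences() {
  const raw = await storage.getItem(PREFS_KEY);
  return raw ? JSON.parse(raw) : { genres: [], platform: 'spotify' };
}
```

In the app, swapping the `storage` object for the real `AsyncStorage` import would keep the rest of the code unchanged.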
To predict the user's current emotional state, we used existing Machine Learning solutions to analyse emotion from facial expressions and word semantics, together with an in-house Machine Learning model for voice tone. The results are combined to classify the user's emotion into one of four basic emotions (happiness, sadness, anger, fear), following the well-established Valence-Arousal-Dominance model.
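The fusion step can be sketched as follows. This is a minimal illustration, assuming each modality (face, voice tone, word semantics) emits a valence/arousal/dominance triple in [-1, 1] that is averaged with equal weights, then mapped to the four basic emotions by region: positive valence maps to happiness, negative valence with low arousal to sadness, and negative valence with high arousal to anger or fear depending on dominance. The actual weighting and decision rule used in the app may differ.

```javascript
// Map an averaged (valence, arousal, dominance) point to one of the four
// basic emotions. The thresholds at zero are an illustrative assumption.
function classifyVAD(valence, arousal, dominance) {
  if (valence >= 0) return 'happiness';          // only positive-valence class
  if (arousal < 0) return 'sadness';             // negative valence, low arousal
  return dominance >= 0 ? 'anger' : 'fear';      // high arousal: split on dominance
}

// Fuse per-modality predictions by averaging each dimension with equal weight.
function fuseEmotion(modalities) {
  const avg = (key) =>
    modalities.reduce((sum, m) => sum + m[key], 0) / modalities.length;
  return classifyVAD(avg('valence'), avg('arousal'), avg('dominance'));
}
```

For example, a frightened-looking face combined with a tense voice (negative valence, high arousal, low dominance in all modalities) would fuse to `'fear'`.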
We also developed an admin portal for manually curating the music and movie databases, with some metadata queried from the Genius and The Movie Database (TMDB) APIs. The application is deployed in a containerised environment to ensure availability.
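The metadata lookup can be sketched as below. `/search/movie` with `api_key` and `query` parameters is TMDB's documented search endpoint; the surrounding wiring (function names, how the key is supplied, which result fields are kept) is illustrative rather than the portal's actual implementation.

```javascript
// Base URL for TMDB's v3 REST API.
const TMDB_BASE = 'https://api.themoviedb.org/3';

// Build the search URL for a movie title; URLSearchParams handles encoding.
function buildSearchUrl(apiKey, title) {
  const params = new URLSearchParams({ api_key: apiKey, query: title });
  return `${TMDB_BASE}/search/movie?${params.toString()}`;
}

// In the admin portal this URL would be fetched and the first result's
// metadata (title, overview, poster path) stored alongside the curated entry:
//   const res = await fetch(buildSearchUrl(key, 'Inception'));
//   const { results } = await res.json();
```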
Challenges we ran into
We encountered several challenges when developing the emotion detection module. Emotion detection is a non-trivial problem, as individuals might have different facial and tonal features for the same emotion.