We went into our first hackathon hoping to complete a product that would benefit the people around us. As Frosh, we grew up noticing the troubling effects of poor mental health on our peers. More often than not, the root cause was a lack of awareness of their own behaviour patterns and mental health. We are strong believers in the power of data to transform lives for the better, so we decided to build an app that monitors the user's mental health and provides them with data about it.
What does InSight do?
InSight is a desktop executable app that monitors the user's face through the laptop camera, using Google Cloud Vision together with a Keras/TensorFlow model to detect the emotions the user exhibits, while simultaneously recording which app the user is using. In the interface, the user can view all of this data in graph and logbook form to quantify their behaviour patterns. InSight also has an emotion calendar logbook that the user can fill in to have daily data at their fingertips. The app includes a resources tab with a quote of the day and links for mental health help. As the cherry on top, we also created an Alexa skill for ease of access: the simple phrase "Alexa, emotional InSight" gets a response with the emotion you felt most that day and the app that was related to it.
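At its core, the Alexa skill's reply is a simple aggregation over the day's log. A minimal sketch in Python (the entry format and the function name here are our own, for illustration only):

```python
from collections import Counter

def daily_summary(entries):
    """Given a day's log of (emotion, app) pairs, return the most
    frequent emotion and the app most often in use while it was felt."""
    if not entries:
        return None
    # Tally emotions across the whole day and take the most common one.
    emotions = Counter(emotion for emotion, _ in entries)
    top_emotion, _ = emotions.most_common(1)[0]
    # Among entries with that emotion, find the most common foreground app.
    apps = Counter(app for emotion, app in entries if emotion == top_emotion)
    top_app, _ = apps.most_common(1)[0]
    return top_emotion, top_app

# Hypothetical log for one day:
log = [("joy", "Spotify"), ("joy", "Spotify"),
       ("sorrow", "Email"), ("joy", "YouTube")]
print(daily_summary(log))  # ('joy', 'Spotify')
```

The skill then reads this pair back to the user in a spoken response.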
How was InSight made?
After conducting decision matrices to pick the best proposed solution, we split into teams of two to tackle the back-end and the front-end user interface. On the front-end, we used Electron and Angular to build the app as an executable, and Chart.js to graph the data returned by the API and present it to the user in a pleasant, elegant manner. For the back-end, we interfaced with the Google Cloud Vision AI to accurately detect faces and get a preliminary guess at the emotion being expressed. We then ran a pre-trained Keras model, found on GitHub and trained on a facial emotion recognition dataset, and combined its results with the Vision AI's for a more accurate reading. Lastly, we created an online endpoint to serve our data: a RESTful service built with Flask, Nginx, and Gunicorn on top of Google Compute Engine.
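The fusion step can be sketched as follows. Cloud Vision reports per-face emotions as likelihood enums (VERY_UNLIKELY through VERY_LIKELY) rather than probabilities, so one way to combine them with the Keras model's softmax output is to map the enums onto [0, 1] and take a weighted blend; the score mapping, the weight, and the function below are our own illustrative assumptions, not the exact values we shipped:

```python
# Cloud Vision emotion likelihood enums mapped onto a rough [0, 1] scale.
LIKELIHOOD_SCORE = {
    "VERY_UNLIKELY": 0.0, "UNLIKELY": 0.25, "POSSIBLE": 0.5,
    "LIKELY": 0.75, "VERY_LIKELY": 1.0,
}

def combine(vision_likelihoods, keras_probs, vision_weight=0.4):
    """Blend Vision AI likelihoods with the Keras model's softmax
    probabilities and return the highest-scoring emotion."""
    scores = {}
    for emotion, prob in keras_probs.items():
        likelihood = vision_likelihoods.get(emotion, "VERY_UNLIKELY")
        vision_score = LIKELIHOOD_SCORE.get(likelihood, 0.0)
        # Weighted average of the two readings.
        scores[emotion] = vision_weight * vision_score + (1 - vision_weight) * prob
    return max(scores, key=scores.get)

# Example readings from the two sources for one frame:
vision = {"joy": "VERY_LIKELY", "sorrow": "UNLIKELY", "anger": "VERY_UNLIKELY"}
keras = {"joy": 0.55, "sorrow": 0.35, "anger": 0.10}
print(combine(vision, keras))  # joy
```

When the two sources disagree, the weight decides which one to trust more; we leaned on the Keras model since it was trained specifically for emotion classification.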
What challenges did InSight face?
We found that no InSight team member had solid experience creating a UI for a desktop app, so two of our members invested time in learning how to build an effective UI/UX with the help of mentors and online resources. Another challenge was learning the Alexa Skills Kit SDK in order to incorporate it with our desktop app and assign skills. Moving all of the computing onto Google Compute Engine was also a challenge we faced.
Accomplishments that we’re proud of
Completing a full product in our first hackathon was a win in our book, but we were even happier to implement extra features, such as the Alexa skill and its commands, to boost our program. The first time our facial recognition worked and returned the correct emotion was amazing. Having a proper endpoint to make integration easier was also very cool.
What did we learn?
We learned how to manage our time well and work efficiently as a team in our first hackathon to present a completed product. We also gained experience producing an effective and elegant UI/UX for our app.
What’s next for InSight?
In the future, we will focus on developing a mobile implementation, since people spend more time on their phones than on their computers. We also want to spread the word about our VENT service, since we truly believe it has great potential to revolutionize teenage mental health monitoring. Going in, we knew our constraints on this subject: we wanted to present the information to the user without drawing conclusions from the emotions exhibited. We hope to enlist the help of psychiatrists to better analyze emotions and improve our app so that it can suggest behavioural changes to better the mental health of the user.
Our pre-trained model was sourced from user ivadym on GitHub. Thank you very much! We also based our UI/UX on a Creative Tim design!