We spend a lot of our time sitting in front of a computer. The idea is to use the webcam's video feed to determine the user's emotional state, analyze it, and provide feedback accordingly in the form of music, pictures, and videos.
How I built it
Using Microsoft Cognitive Services (the Video and Emotion APIs), we determine the user's emotional state from the webcam feed. We pass those scores to the Microsoft Bot Framework, which sends responses based on changes in the emotional-state values.
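The core logic above can be sketched as a small decision function. This is an illustrative sketch, not the actual bot code: the score keys match the eight emotions the Emotion API reports, but the change threshold and the feedback mapping are hypothetical placeholders.

```python
# Sketch: reduce Emotion API scores to a dominant emotion and decide
# whether the change since the last reading warrants new feedback.

EMOTIONS = ("anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise")

FEEDBACK = {  # hypothetical content mapping, for illustration only
    "sadness": "uplifting music",
    "anger": "calming pictures",
    "happiness": "more of the current content",
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(EMOTIONS, key=lambda e: scores.get(e, 0.0))

def should_respond(prev_scores, curr_scores, threshold=0.25):
    """Respond only when some emotion score shifted by more than threshold."""
    return any(abs(curr_scores.get(e, 0.0) - prev_scores.get(e, 0.0)) > threshold
               for e in EMOTIONS)

if __name__ == "__main__":
    prev = {"neutral": 0.9, "sadness": 0.05}
    curr = {"neutral": 0.3, "sadness": 0.6}
    if should_respond(prev, curr):
        print(FEEDBACK.get(dominant_emotion(curr), "neutral content"))
```

In the real app, `prev_scores` and `curr_scores` would come from successive Emotion API responses on webcam frames, and the chosen feedback would be sent through the bot's reply channel.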
Challenges I ran into
Passing data between the Bot Framework and the desktop application that captures the webcam feed.
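One way to bridge a desktop capture app and a bot is a small local HTTP endpoint that receives emotion payloads. The sketch below is a self-contained demo of that handoff pattern using Python's standard library; the route, port choice, and payload shape are assumptions, not the project's actual wiring.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

received = []  # payloads the "bot side" has seen

class EmotionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON emotion payload posted by the capture app.
        length = int(self.headers["Content-Length"])
        received.append(json.loads(self.rfile.read(length)))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

def run_demo():
    # Bot side: listen on any free local port.
    server = HTTPServer(("127.0.0.1", 0), EmotionHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Capture-app side: post one reading of emotion scores.
    url = "http://127.0.0.1:%d/emotion" % server.server_port
    payload = json.dumps({"happiness": 0.8}).encode()
    urlopen(Request(url, data=payload,
                    headers={"Content-Type": "application/json"}))

    server.shutdown()
    return received[-1]

if __name__ == "__main__":
    print(run_demo())  # the payload arrives intact on the bot side
```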
Accomplishments that I'm proud of
A fully functional bot that provides feedback to the user based on changes in their emotions.
What I learned
Visual Studio is a pain to work with.
What's next for ICare
Use a recurrent neural network (RNN) to track the user's emotional state before and after each piece of content, and improve the content provided to the user over time.
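As a rough illustration of that idea, an RNN would fold a sequence of per-frame emotion readings into a single hidden state summarizing recent history. The sketch below is an untrained Elman-style cell with placeholder sizes and random weights, not the planned model.

```python
import numpy as np

# Eight scores per frame, matching the Emotion API's emotion set;
# the hidden size and all weights are illustrative placeholders.
N_EMOTIONS, HIDDEN = 8, 16
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(HIDDEN, N_EMOTIONS))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))      # hidden -> hidden
b_h = np.zeros(HIDDEN)

def rnn_step(h, x):
    """One recurrent update: fold the latest emotion scores x into state h."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

def summarize(history):
    """Run the cell over a sequence of per-frame emotion score vectors."""
    h = np.zeros(HIDDEN)
    for x in history:
        h = rnn_step(h, x)
    return h

if __name__ == "__main__":
    frames = rng.random((5, N_EMOTIONS))  # five readings of the 8 scores
    state = summarize(frames)
    print(state.shape)  # a fixed-size summary of the emotion history
```

Once trained, a state like this could condition which music, pictures, or videos the bot serves next.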