Inspiration
When we come home after a busy or eventful day, we like to relax with music. With the theme of this hackathon being an exploration of the relationship between a house/home and the person living in it, we wondered if we could build something where the house itself interacts with the person, i.e. performs some action based on the person's state. Our idea: when someone enters their home, a camera in the wall assesses their mood and plays music suited to it, in an attempt to either enhance or adjust that mood (these behaviors are customizable in the app). To demonstrate this, we built a small proof-of-concept device that uses the camera on a smartphone to capture a photo of the person's face, processes the photo with an emotion recognition API, and finally chooses a track to play based on the emotion returned by the API.
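To illustrate that last step, here is a minimal sketch of how the dominant emotion label returned by the API might be mapped to a track. The emotion labels follow the ones the Azure Face API returns (happiness, sadness, anger, ...), but the track resource names are placeholders, not actual project resources.

```kotlin
// Hypothetical mapping from the dominant emotion label returned by the
// emotion recognition API to a locally bundled audio resource.
// The R.raw.* identifiers are placeholders, not real resources in this project.
fun trackForEmotion(emotion: String): Int = when (emotion.lowercase()) {
    "happiness" -> R.raw.upbeat_track      // keep the good mood going
    "sadness"   -> R.raw.comforting_track  // gentle, consoling music
    "anger"     -> R.raw.calming_track     // try to defuse the mood
    else        -> R.raw.neutral_track     // default/ambient choice
}
```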
What we learned
We learned how to call the Azure API from an Android app, and we got more experience with Android app development in general, picking up a few new APIs along the way (MediaPlayer, for example).
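For reference, playing a bundled track with Android's MediaPlayer is roughly this simple; the track resource passed in is a placeholder, and this is only a sketch of the approach rather than the app's exact code.

```kotlin
import android.content.Context
import android.media.MediaPlayer

// Play a locally bundled audio resource (e.g. a placeholder R.raw.calming_track).
fun playTrack(context: Context, trackRes: Int) {
    val player = MediaPlayer.create(context, trackRes)  // prepares a player for the raw resource
    player.setOnCompletionListener { it.release() }     // free the player once playback finishes
    player.start()
}
```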
How we built this app
The input device is the smartphone's camera. Computation happens on the smartphone and through the Azure API, and the output device is the smartphone's speaker.
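For context, a hedged sketch of the kind of request involved, assuming the Face API's detect endpoint with `returnFaceAttributes=emotion`: the captured photo is uploaded as raw bytes and the response is JSON containing per-face emotion scores. The endpoint region and key below are placeholders, and this is illustrative rather than the exact call the app makes.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Illustrative only: send a captured JPEG to the Azure Face API detect endpoint
// and ask for emotion attributes. Run this off the main thread on Android.
fun detectEmotion(imageBytes: ByteArray): String {
    val url = URL(
        "https://YOUR_REGION.api.cognitive.microsoft.com/face/v1.0/detect" +
            "?returnFaceAttributes=emotion"
    )
    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Ocp-Apim-Subscription-Key", "YOUR_FACE_API_KEY")
    conn.setRequestProperty("Content-Type", "application/octet-stream")
    conn.doOutput = true
    conn.outputStream.use { it.write(imageBytes) }                   // upload the raw photo bytes
    return conn.inputStream.bufferedReader().use { it.readText() }   // JSON with per-face emotion scores
}
```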
Challenges we faced
Version control issues, and trouble connecting the smartphone to the Azure API (sending requests and receiving responses).