Inspiration
There is so much going on in one's life. People are often unaware of how they are truly feeling at a given moment, or are moving through life too fast to have time to reflect. Our team wanted to build intelligent technology that offers insight into who you are without requiring you to consciously stop and think. Music is something everyone enjoys across different genres, and the music you connect with says a lot about the kind of person you are. We wanted music to be a way for technology to "autoreflect" who a person is as they constantly change and grow every day.
What it does
With an Android smartphone, one can scan a group of people using the front or back camera. The image is analyzed and each individual's facial expression is saved to a JSON file containing the levels of the different emotions that expression gives off. From there, the detected emotions are mapped to musical notes to produce a melodic tune, with pitches chosen so that any combination of them blends well. The emotion data comes from Microsoft's Azure API.
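A minimal sketch of what that emotion-to-note mapping could look like. This is not the team's actual code: the pentatonic-scale approach, the class and method names, and the emotion list (taken from the categories the Azure emotion service returned at the time) are all illustrative assumptions. Restricting every note to one pentatonic scale is one simple way to keep arbitrary combinations of pitches sounding consonant.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

public class EmotionToNotes {

    // C-major pentatonic MIDI pitches; notes drawn from a single
    // pentatonic scale avoid harsh dissonance in any combination.
    private static final int[] PENTATONIC = {60, 62, 64, 67, 69};

    private static final String[] EMOTIONS = {
            "anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise"
    };

    /** Picks one note per face: the pitch assigned to its strongest emotion. */
    public static List<Integer> notesForFaces(List<Map<String, Double>> faces) {
        List<Integer> notes = new ArrayList<>();
        for (Map<String, Double> scores : faces) {
            String strongest = EMOTIONS[0];
            for (String emotion : EMOTIONS) {
                if (scores.getOrDefault(emotion, 0.0)
                        > scores.getOrDefault(strongest, 0.0)) {
                    strongest = emotion;
                }
            }
            // Map the winning emotion's index onto the pentatonic scale.
            int index = Arrays.asList(EMOTIONS).indexOf(strongest);
            notes.add(PENTATONIC[index % PENTATONIC.length]);
        }
        return notes;
    }
}
```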
How we built it
Our team used Android Studio to develop both the front end and back end of the application. We integrated the Microsoft Azure API for emotion analysis across several people's faces, then parsed the JSON it returned to generate music from the faces in an image.
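For context, a hedged sketch of the kind of request this integration involves. The endpoint and header names follow the REST interface of Azure's emotion service from that era; the exact service, region, and surrounding code are assumptions, and the API key is a placeholder.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class EmotionClient {

    private static final String ENDPOINT =
            "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize";
    private static final String API_KEY = "YOUR_SUBSCRIPTION_KEY"; // placeholder

    /** Posts raw JPEG bytes and returns the JSON array of per-face emotion scores. */
    public static String recognize(byte[] imageBytes) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(ENDPOINT).openConnection();
        conn.setRequestMethod("POST");
        // The image goes up as a raw binary body, keyed by subscription header.
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        conn.setRequestProperty("Ocp-Apim-Subscription-Key", API_KEY);
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(imageBytes);
        }
        // Read the whole response body as one string of JSON.
        try (java.util.Scanner s = new java.util.Scanner(
                conn.getInputStream()).useDelimiter("\\A")) {
            return s.hasNext() ? s.next() : "";
        }
    }
}
```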
Challenges we ran into
It was extremely difficult to set up the Microsoft Azure API correctly at first: network requests to the API returned no response. With the documentation available online, we managed to connect the dots and set up our project to call the API properly rather than going through the Microsoft Project Oxford backend directly. It was also difficult to receive valid JSON objects at first, since converting files to bitmaps and then to a byte array required very specific code before a proper response object came back. Ultimately, we were able to get the data we needed.
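The file-to-bitmap-to-byte-array conversion described above can be done with standard Android APIs. A sketch of one way it might look, assuming a JPEG re-encode (the class name and quality value are illustrative):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.ByteArrayOutputStream;

public class ImageBytes {

    /** Decodes an image file and re-encodes it as JPEG bytes for the request body. */
    public static byte[] fromFile(String path) {
        Bitmap bitmap = BitmapFactory.decodeFile(path);
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        // An application/octet-stream upload expects the encoded image,
        // not Android's in-memory pixel format, hence the re-compress.
        bitmap.compress(Bitmap.CompressFormat.JPEG, 90, stream);
        return stream.toByteArray();
    }
}
```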
Accomplishments that we're proud of
Our application is very robust and exceeded our expectations for how much of our idea we were able to bring to reality. We also started the project entirely from scratch, learning a lot about Microsoft Azure's API along the way, such as the need for the calling code to be structured *asynchronously*.
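To illustrate that asynchronous structure: Android forbids network I/O on the main thread, so the API call has to run in the background with the result delivered back to the UI thread. A sketch using the era's AsyncTask pattern, reusing the hypothetical EmotionClient and ImageBytes helpers from the sketches above:

```java
import android.os.AsyncTask;

public class RecognizeTask extends AsyncTask<String, Void, String> {

    @Override
    protected String doInBackground(String... paths) {
        try {
            // Runs off the main thread, so network I/O is allowed here.
            return EmotionClient.recognize(ImageBytes.fromFile(paths[0]));
        } catch (Exception e) {
            return null;
        }
    }

    @Override
    protected void onPostExecute(String json) {
        // Back on the UI thread: parse the JSON and hand it to the music layer.
    }
}
```

Usage would be a one-liner such as `new RecognizeTask().execute("/path/to/photo.jpg");`.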
What we learned
Persevering through a tedious or seemingly "impossible to solve" problem is definitely worth it. We overcame several hurdles while building what we wanted, and the satisfaction those victories brought our team was unlike anything else.
What's next for Memonic
Ultimately, the app could have other use cases that connect emotions to music. Music evokes feelings we love to experience as humans, and the right song at the right moment can positively impact one's day no matter how one is feeling. The app's music could intelligently play a tune one loves over the memories in a photo gallery as well as throughout day-to-day life in the present. The emotion-analysis side could also broaden into other aspects of life, such as suggesting what one should do or where one should go based on one's mood.