I was listening to the radio in the car and heard a good song, but I obviously couldn't reach for my phone and open Shazam to find out its name. Then I thought: what if there were a hands-free way to solve this problem?
What it does
When users hear a song they like on the car radio but don't know its name, they can simply use a hand gesture to identify it and push it to their Spotify playlist.
How I built it
I created a plan with a few steps and followed it. I started by making sure the Myo gestures worked in the code, then began working on the Spotify API. I went bit by bit and slowly put everything together, with UI/UX as a finishing touch.
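The flow described above (gesture fires → song gets recognized → track is pushed to a playlist) can be sketched roughly as follows. The app was written in Objective-C; this is an illustrative Python sketch, and the function names, the `"double_tap"` trigger gesture, and the `recognize_song` hook are all hypothetical. The playlist endpoint, however, is the real Spotify Web API one.

```python
def build_add_track_request(playlist_id, track_uri, access_token):
    """Build (but do not send) the Spotify Web API request that adds a
    track to a playlist: POST /v1/playlists/{playlist_id}/tracks."""
    return {
        "method": "POST",
        "url": f"https://api.spotify.com/v1/playlists/{playlist_id}/tracks",
        "headers": {
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        "json": {"uris": [track_uri]},
    }

def on_gesture(gesture, recognize_song, playlist_id, access_token):
    """When the chosen Myo gesture fires, look up the song currently
    playing and return the request that pushes it to the playlist."""
    if gesture != "double_tap":   # hypothetical trigger gesture
        return None
    track_uri = recognize_song()  # e.g. a Shazam-style audio lookup
    return build_add_track_request(playlist_id, track_uri, access_token)
```

Anything other than the trigger gesture is ignored, so random arm movement while driving doesn't spam the playlist.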
Challenges I ran into
Parsing JSON in Objective-C, latency with the Spotify API, and especially the accuracy/reliability of the Myo.
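One common way to tame unreliable gesture readings like the Myo's is debouncing: only act on a pose once it has been reported several times in a row, which filters out one-off misclassifications. This is a sketch of that idea in Python, not the app's actual Objective-C code, and the pose names are hypothetical.

```python
class GestureDebouncer:
    """Fire a gesture only after it is seen N times consecutively."""

    def __init__(self, required_count=3):
        self.required_count = required_count
        self.last_pose = None
        self.streak = 0

    def update(self, pose):
        """Feed one raw pose reading; return the pose exactly once,
        when the streak reaches required_count, else None."""
        if pose == self.last_pose:
            self.streak += 1
        else:
            self.last_pose = pose
            self.streak = 1
        if self.streak == self.required_count:
            return pose
        return None
```

Raising `required_count` trades responsiveness for fewer false triggers, which matters when a spurious trigger adds a wrong song to the playlist.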
Accomplishments that I'm proud of
I'm proud of integrating cool hardware into my project for the first time (the Myo armband).
What I learned
How to work with the Myo armband, as well as Objective-C and the Spotify API.
What's next for musify
I would keep refining the detection algorithm and adding features to ensure the right song is added to Spotify, and maybe incorporate some machine learning to recommend more songs the user would like.