Inspiration
We wanted to build an app that introduces people to new music, and we wanted to work with IBM's Watson API because we thought it was cool. Eventually, we decided on an iOS app that introduces people to new music based on their current mood.
What it does
Mello analyses the user's speech pattern and recommends new songs to match their emotions. The app lets the user save their favourite songs and export them to their music library.
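The core idea of matching emotions to songs can be sketched as a simple lookup from a detected emotion label to a playlist. This is a minimal illustration, not Mello's actual code: the emotion names and song titles are placeholder examples.

```swift
import Foundation

// Hypothetical sketch: map an emotion label (e.g. the dominant tone
// detected in the user's speech) to a playlist of song titles.
// Emotions and songs here are illustrative placeholders.
enum Emotion: String {
    case joy, sadness, anger, fear
}

let playlists: [Emotion: [String]] = [
    .joy:     ["Walking on Sunshine", "Happy"],
    .sadness: ["Someone Like You", "Fix You"],
    .anger:   ["Killing in the Name", "Break Stuff"],
    .fear:    ["Thriller", "Disturbia"]
]

func recommendSongs(for emotionLabel: String) -> [String] {
    guard let emotion = Emotion(rawValue: emotionLabel.lowercased()),
          let songs = playlists[emotion] else {
        return []  // unknown emotion: nothing to recommend
    }
    return songs
}
```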
How we built it
We built the app in Xcode, using IBM's Watson API hosted on Bluemix.
Challenges we ran into
- Getting the music to start, stop, and play/pause
- Getting the tabs to share information
- Storing information in a TableView
- Implementing the Watson API
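The play/pause/stop logic can be handled with AVFoundation's AVAudioPlayer. Below is a minimal sketch of that approach; the resource name and wrapper class are illustrative, not Mello's actual implementation.

```swift
import AVFoundation

// Minimal sketch of start/stop/play-pause control using AVAudioPlayer.
// The class name and the bundled "mp3" resource are placeholders.
final class MusicPlayer {
    private var player: AVAudioPlayer?

    func load(resource: String) {
        guard let url = Bundle.main.url(forResource: resource,
                                        withExtension: "mp3") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay()  // buffer audio so playback starts promptly
    }

    func togglePlayPause() {
        guard let player = player else { return }
        if player.isPlaying {
            player.pause()  // keeps the current playback position
        } else {
            player.play()
        }
    }

    func stop() {
        player?.stop()
        player?.currentTime = 0  // rewind so the next play starts from the top
    }
}
```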
Accomplishments that we're proud of
- Getting the music to play
- Getting the Watson API to successfully interpret the user's text input, which we then use to play from a recommended playlist of songs
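Sending the user's text to Watson for tone analysis could look roughly like the sketch below, which posts the text to the Tone Analyzer service and picks out the strongest detected tone. The endpoint, version date, and credentials are placeholders, and the JSON parsing assumes the Tone Analyzer v3 response shape (`document_tone` → `tones`); this is a hedged illustration, not Mello's actual networking code.

```swift
import Foundation

// Hedged sketch: POST the user's text to Watson Tone Analyzer and
// return the tone_id with the highest score. URL, version string,
// and "username:password" credentials are placeholder values.
func analyzeTone(of text: String, completion: @escaping (String?) -> Void) {
    var components = URLComponents(string:
        "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone")!
    components.queryItems = [URLQueryItem(name: "version", value: "2017-09-21")]

    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // Basic auth with service credentials (placeholders).
    let credentials = Data("username:password".utf8).base64EncodedString()
    request.setValue("Basic \(credentials)", forHTTPHeaderField: "Authorization")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["text": text])

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let json = try? JSONSerialization.jsonObject(with: data)
                              as? [String: Any],
              let documentTone = json["document_tone"] as? [String: Any],
              let tones = documentTone["tones"] as? [[String: Any]],
              let strongest = tones.max(by: {
                  ($0["score"] as? Double ?? 0) < ($1["score"] as? Double ?? 0)
              })
        else { completion(nil); return }
        completion(strongest["tone_id"] as? String)
    }.resume()
}
```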
What we learned
We all learned how to build an iOS application.
What's next for Mello
- Adding a rating system to the table
- Developing a 'share-with-friends' feature
- Adding more emotions to choose from
- Matching people who share your music preferences
- Integrating a streaming service like Spotify to stream music directly