Inspiration
Places and events engage many senses at once (touch, sight, and sound) that shape how we experience them. A photo captures only one dimension of that experience. Today we are hit with many kinds of visual clutter: scrolling through news feeds, we see photos of riots and protests without really grasping the gravity of the situation. Vibify (vibe-ify) seeks to bring you closer to the event by creating a dedicated playlist of music based on a photo you provide.
What it does
This service gives you insight into an event or photo in a lower-friction way through something everyone enjoys: music. The playlist generated from the photo is personalized, since the Spotify API takes into account the account's previous listening history along with the keywords supplied as input. We want people to understand a little more about situations around the world in an enjoyable way. You can also reorder the playlist in multiple ways, such as by a track's acousticness or liveness, giving you a frictionless listening experience without having to skip around to find a song that fits the mood.
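The reordering idea can be sketched as a simple sort over Spotify audio-feature values. The track objects below are illustrative placeholders, not real API responses; in the app these values would come from Spotify's audio features endpoint.

```javascript
// Sort tracks by a Spotify audio feature such as "acousticness" or
// "liveness" (both range from 0.0 to 1.0). Descending by default, so
// the most acoustic (or most live) tracks come first.
function sortByFeature(tracks, feature, descending = true) {
  return [...tracks].sort((a, b) =>
    descending ? b[feature] - a[feature] : a[feature] - b[feature]
  );
}

// Illustrative tracks with hardcoded feature values.
const tracks = [
  { name: "Track A", acousticness: 0.12, liveness: 0.8 },
  { name: "Track B", acousticness: 0.95, liveness: 0.1 },
  { name: "Track C", acousticness: 0.4, liveness: 0.55 },
];

// Most acoustic first: Track B, Track C, Track A
const byAcousticness = sortByFeature(tracks, "acousticness");
```

Sorting a copy (`[...tracks]`) leaves the original playlist order untouched, so the user can switch between orderings freely.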
How we built it
We built this with the Google Cloud Vision API and Spotify's playlist creation API, using keywords generated by Cloud Vision to drive the playlist. We also used other parts of the Spotify API to organize the playlist within Spotify.
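The glue between the two APIs can be sketched as a function that turns Cloud Vision label annotations into Spotify search keywords. The `{ description, score }` shape matches Vision's label annotation objects, but the sample labels and the 0.7 confidence threshold are our own illustrative choices.

```javascript
// Convert Google Cloud Vision label annotations into lowercase search
// keywords, keeping only labels the model is reasonably confident about.
function labelsToKeywords(labels, minScore = 0.7) {
  return labels
    .filter((label) => label.score >= minScore)
    .map((label) => label.description.toLowerCase());
}

// Illustrative Vision output for a concert photo.
const labels = [
  { description: "Concert", score: 0.97 },
  { description: "Crowd", score: 0.91 },
  { description: "Stage", score: 0.55 }, // below threshold, dropped
];

// "concert crowd" — joined keywords fed into a Spotify search query.
const query = labelsToKeywords(labels).join(" ");
```

Filtering on the confidence score keeps low-certainty labels from polluting the search query, which matters because Vision returns many labels per image at widely varying confidence levels.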
Challenges we ran into
Figuring out how to use each specific API, and how its data would be consumed and stored, proved difficult, as did linking the APIs together. Another challenge was understanding how Spotify handles its API functions and its own data.
Accomplishments that we are proud of
We were able to implement Google Cloud Vision and connect it to Spotify's playlist creation API.
What we learned
We learned how Google Cloud Vision outputs data and how the AI's training allows different outputs to be generated at differing confidence levels. This project showed how much the quality of the input data can change the results.
What's next for Vibify
When the Cloud Vision demo was presented, it showed that Cloud Vision can output situational labels such as "competitive". This suggests that with training, the Cloud Vision AI can learn things that are not directly in the image, such as the "vibe" of the situation in the photo. For example, a photo of a protest is not just about the protest but also about the passion, tension, and grit it embodies. Keywords like these would make the songs selected for the playlist better embody the situation in the photo. Another avenue is podcasts: Spotify's library is expanding with its growing community of podcast listeners and creators. Searching for and building a playlist of podcasts is not currently supported in any of the APIs, but implementing a similar concept for podcasts would be easy and frictionless.
Built With
- google-vision
- javascript
- spotify