Inspiration
Love of music and open source projects.
What it does
With the help of machine vision we analyze the atmosphere of a space. We combine this with the Spotify API to match and play the tracks that best suit the current energy of the environment.
How we built it
Video from the camera feeds into our Python script, which uses an OpenCV detector to detect movement in the environment and sets the parameters for the playlist controller.
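The perception step boils down to measuring how much the scene changes between frames and mapping that to controller parameters. The real pipeline uses OpenCV on live camera frames; the sketch below uses plain NumPy frame differencing to show the core idea, and the parameter names and thresholds are illustrative, not our exact code.

```python
import numpy as np

def motion_energy(prev_frame: np.ndarray, frame: np.ndarray, threshold: int = 25) -> float:
    """Fraction of pixels whose intensity changed by more than `threshold`.

    In the real pipeline OpenCV (e.g. cv2.absdiff on grayscale camera
    frames) does this step; plain NumPy is used here for the sketch.
    """
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((diff > threshold).mean())

def energy_to_params(energy: float) -> dict:
    """Map movement energy to hypothetical playlist-controller parameters."""
    if energy > 0.30:
        return {"mood": "party", "target_energy": 0.9}
    if energy > 0.05:
        return {"mood": "chill", "target_energy": 0.5}
    return {"mood": "ambient", "target_energy": 0.2}

# Two synthetic 8-bit grayscale "frames": static background, then a large
# bright region that moved into view.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[20:100, 20:140] = 255  # half the frame changed

params = energy_to_params(motion_energy(prev, curr))
print(params)  # high movement -> {'mood': 'party', 'target_energy': 0.9}
```

In the live system this loop runs continuously, smoothing the energy over a window of frames so a single passer-by does not flip the mood.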
The controller receives the parameters and plays the corresponding tracks via the Spotify API.
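The controller side can be sketched with the `spotipy` client library, assuming OAuth credentials are provided via environment variables. The genre mapping and function names below are illustrative assumptions, not our exact implementation.

```python
import os

def params_to_search_query(params: dict) -> str:
    """Turn perception parameters into a Spotify track-search query.

    The mood-to-genre mapping is an illustrative assumption.
    """
    genre = {"party": "dance", "chill": "lo-fi", "ambient": "ambient"}[params["mood"]]
    return f"genre:{genre}"

def play_matching_track(params: dict) -> None:
    """Search for a matching track and start playback on the active device."""
    import spotipy
    from spotipy.oauth2 import SpotifyOAuth

    sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
        scope="user-modify-playback-state",
        client_id=os.environ["SPOTIPY_CLIENT_ID"],
        client_secret=os.environ["SPOTIPY_CLIENT_SECRET"],
        redirect_uri=os.environ["SPOTIPY_REDIRECT_URI"],
    ))
    results = sp.search(q=params_to_search_query(params), type="track", limit=1)
    track = results["tracks"]["items"][0]
    sp.start_playback(uris=[track["uri"]])  # plays on the user's active device

# Usage (requires valid Spotify credentials and an active device):
#   play_matching_track({"mood": "party", "target_energy": 0.9})
```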
Everything is wrapped in a web-based graphical user interface which displays the video stream, the current track, and the current "meining" (mood/atmosphere) as emojis.
You can see the GUI at junction-spotify.firebaseapp.com
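The mood-to-emoji display in the GUI is essentially a lookup table; a minimal sketch, with emoji choices that are illustrative rather than our exact set:

```python
# Hypothetical mapping from detected mood to the emoji shown in the GUI.
MOOD_EMOJI = {"party": "🎉", "chill": "😌", "ambient": "🌙"}

def mood_to_emoji(mood: str) -> str:
    """Return the emoji for a mood, with a fallback for unknown moods."""
    return MOOD_EMOJI.get(mood, "🤔")

print(mood_to_emoji("party"))  # 🎉
```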
Challenges we ran into
Dev environment setups: "all the classics", "all the problems", "if this was D&D we would have rolled a natural 1". Luckily our skills helped us move on.
We have a clear vision of which parts of the architecture we need to improve and how we can improve them.
Accomplishments that we're proud of
Our original plan was to join the health tech track, but we saw we could really step out of our comfort zone on the Spotify track, so we came here.
We are genuinely proud that we were able to develop the platform even though we had no previous experience with machine vision. It's amazing to see that the architecture we designed is now up and running.
We are also proud that we had so much fun during the long hours of coding and fighting dev environment problems.
What we learned
Teamwork is important: when the project is broken down correctly across the right individuals and skill sets, you can achieve a lot in a short time.
What's next for SpotiMeining
Switch the perception algorithm to darkflow/darknet YOLO, do some server-client optimization, and deploy the whole package at our university's party house!