Inspiration
We were inspired to build this web app because we noticed that many people struggle to express their feelings, and art should always be an outlet for doing so. We wanted users to be able to share their feelings with the people they care about without having to find the right words to describe them. One last motivator was making a more inclusive music app: Lambda is for everyone, and it can even be seen as a Spotify for the deaf.
What it does
Lambda's mission is to let music enthusiasts experience songs like never before: imagine bringing to life the very atmosphere you feel when listening to a certain song by creating a visual mood board catered to your aesthetic tendencies. Lambda lets its users explore and discover new genres of music based on the pictures they like, and it also puts together a personalized playlist to match their visual preferences. Users can then listen to the suggested playlist while continuing through their mood feed, letting machine learning constantly refine the playlist. In essence, Lambda is the ultimate bridge between moods and how we imagine them to sound.
How I built it
We wanted the user's initial input to be a series of pictures that they felt best described the atmosphere they were trying to set. Then, using supervised learning, playlists are engineered to match the desired mood by linking music genres to keywords extracted from the selected images. The main difficulty lay in using machine learning to associate moods with picture tags.
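To make the tag-to-genre idea concrete, here is a deliberately simplified sketch. The actual project used a trained supervised model; this stand-in scores genres by how many of an image's tags vote for them, and the keyword table and function names are illustrative assumptions, not the real system.

```python
from collections import Counter

# Hypothetical lookup built from "training" pairs of image tags and
# the genres users associated with them (illustrative values only).
TAG_GENRE_TABLE = {
    "sunset": ["lo-fi", "ambient"],
    "neon": ["synthwave", "electronic"],
    "forest": ["folk", "ambient"],
    "city": ["hip-hop", "electronic"],
    "rain": ["lo-fi", "jazz"],
}

def suggest_genres(image_tags, top_n=2):
    """Rank genres by how many of the image's tags point to them."""
    votes = Counter()
    for tag in image_tags:
        for genre in TAG_GENRE_TABLE.get(tag, []):
            votes[genre] += 1
    return [genre for genre, _ in votes.most_common(top_n)]

print(suggest_genres(["sunset", "rain", "city"]))  # → ['lo-fi', 'ambient']
```

A real model would replace the static table with learned weights, but the shape of the mapping (image tags in, ranked genres out) is the same.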
Challenges I ran into
Shutterstock's API made it possible to quickly select and relate images to themes. However, since the confidence of the computer vision was unknown, additional filtering was necessary to ensure the displayed images actually fit the desired atmosphere or mood. This meant spending more time on picture selection, since those images were the data we then fed into the machine learning part of the web app.
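One cheap way to do that extra filtering, sketched below under assumed data shapes (the `keywords` field and `min_matches` threshold are illustrative, not Shutterstock's actual response schema): keep only images whose returned keywords overlap the target mood by at least a couple of terms, since a raw confidence score is not available.

```python
def filter_by_mood(images, mood_keywords, min_matches=2):
    """Keep images whose keywords overlap the mood by >= min_matches terms."""
    mood = {k.lower() for k in mood_keywords}
    kept = []
    for img in images:  # each img assumed to look like {"id": ..., "keywords": [...]}
        overlap = mood & {k.lower() for k in img["keywords"]}
        if len(overlap) >= min_matches:
            kept.append(img)
    return kept

candidates = [
    {"id": 1, "keywords": ["calm", "ocean", "sunset"]},
    {"id": 2, "keywords": ["busy", "street"]},
]
print(filter_by_mood(candidates, ["calm", "sunset", "beach"]))  # keeps image 1 only
```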
We also initially tried using Octave's API to create the curated playlist, which made it harder to implement the mood-to-genre mapping, since we could only search songs by track name rather than by genre.
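For contrast, Spotify's public Web API search endpoint accepts a `genre:` field filter in the query string, which is exactly what a mood-to-genre pipeline needs. The sketch below only builds the request URL (authentication and the actual HTTP call are omitted):

```python
from urllib.parse import urlencode

SPOTIFY_SEARCH = "https://api.spotify.com/v1/search"

def build_genre_search_url(genre, limit=10):
    """Build a Spotify search URL for tracks filtered by genre."""
    params = {"q": f'genre:"{genre}"', "type": "track", "limit": limit}
    return f"{SPOTIFY_SEARCH}?{urlencode(params)}"

print(build_genre_search_url("lo-fi"))
```

With a track-name-only API like the one we started with, there is simply no equivalent parameter to target, which is why the switch mattered.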
Accomplishments that I'm proud of
As the team had mixed levels of skill, the more experienced coders were able to help the others improve their abilities while distributing tasks in order to keep everyone continuously challenged.
What I learned
It is very important to have a concrete plan of the objectives to be achieved early on, and to communicate each other's ideas in more ways than one. The quality we needed most was the ability to adapt to change, such as discovering that Octave's API only allowed us to search songs by track name, whereas Spotify's API offered a more diverse search engine.
What's next for Lambda
Develop the "My Mood Boards" feature so users can display their selection of images while playing the corresponding playlists. A main goal will also be to include a social networking aspect in which users can share their mood boards with their connections, helping them better express their feelings and better explain the vibe a certain genre puts them in.