There are large amounts of text data describing YouTube videos, and sophisticated machine learning services that can parse that data. We put these resources together to let our users search for music based on how it makes a person feel. This has many applications, from music therapy to enhancing general quality of life.
What it does
FeelMusic accesses YouTube data in the form of comments on a specific video. We submit those comments to IBM's Watson, which performs a kind of machine learning called Natural Language Understanding. Our API accesses this data through a web server, then returns the results to our web page. The user can then sort our songs by emotion, view a song's emotion values, search for any song they choose, and listen to the songs.
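The per-comment emotion scores from Watson have to be combined into a single emotion profile per song before the songs can be sorted. A minimal Python sketch of that step, assuming the documented Watson NLU emotion response shape (the helper names `aggregate_emotions` and `sort_songs_by_emotion` are illustrative, not our actual API):

```python
# Combine per-comment Watson NLU emotion results into one profile per song.
# The nested response shape mirrors Watson NLU's emotion output; helper
# names here are hypothetical, not taken from our codebase.

EMOTIONS = ("joy", "sadness", "anger", "fear", "disgust")

def aggregate_emotions(comment_results):
    """Average the five Watson emotion scores across all analyzed comments."""
    totals = {e: 0.0 for e in EMOTIONS}
    for result in comment_results:
        scores = result["emotion"]["document"]["emotion"]
        for e in EMOTIONS:
            totals[e] += scores.get(e, 0.0)
    n = max(len(comment_results), 1)  # avoid dividing by zero on no comments
    return {e: totals[e] / n for e in EMOTIONS}

def sort_songs_by_emotion(songs, emotion):
    """Sort songs (dicts with an 'emotions' profile) by one emotion, strongest first."""
    return sorted(songs, key=lambda s: s["emotions"].get(emotion, 0.0), reverse=True)
```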
How we built it
Using the .NET Core framework, we built an Angular front end powered by a C# video-data API. The API can access a SQL Server database through our Data Access Layer, or it can call a Python Flask web server to get data from YouTube and Watson.
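To make the storage side concrete: our actual Data Access Layer is C# over SQL Server, but the kind of schema and queries involved can be sketched with Python's stdlib sqlite3 as a stand-in. Table and column names here are assumptions for illustration, not our real schema:

```python
# Illustrative only: sqlite3 stands in for the C#/SQL Server Data Access
# Layer. Schema and names are assumed, not taken from the real project.
import sqlite3

def create_schema(conn):
    conn.execute("""
        CREATE TABLE IF NOT EXISTS songs (
            video_id TEXT PRIMARY KEY,
            title    TEXT NOT NULL,
            joy      REAL, sadness REAL, anger REAL, fear REAL, disgust REAL
        )
    """)

def upsert_song(conn, video_id, title, emotions):
    conn.execute(
        "INSERT OR REPLACE INTO songs VALUES (?, ?, ?, ?, ?, ?, ?)",
        (video_id, title, emotions["joy"], emotions["sadness"],
         emotions["anger"], emotions["fear"], emotions["disgust"]),
    )

def top_songs(conn, emotion, limit=10):
    # the emotion name is whitelisted so the ORDER BY interpolation is safe
    assert emotion in {"joy", "sadness", "anger", "fear", "disgust"}
    return conn.execute(
        f"SELECT video_id, title FROM songs ORDER BY {emotion} DESC LIMIT ?",
        (limit,),
    ).fetchall()
```

The same pattern (parameterized values, whitelisted column names) carries over directly to the C# layer with SqlCommand parameters.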
Challenges we ran into
At first, we were unsure how to make HTTP requests from C#, so we built the Flask API to access YouTube and Watson data as a proof of concept. Later, when we had more time, we were able to move those requests into C#. Another challenge was learning Angular and how to transfer data between different parts of the app.
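The request itself is simple either way; the shape of the call our Flask proof of concept made looks roughly like this (the endpoint and query parameters are the real YouTube Data API v3 `commentThreads` interface; the helper name and use of urllib are illustrative):

```python
# Build the GET URL for fetching top-level comments on a video from the
# YouTube Data API v3 commentThreads endpoint. Helper name is hypothetical.
from urllib.parse import urlencode

YOUTUBE_COMMENTS_URL = "https://www.googleapis.com/youtube/v3/commentThreads"

def build_comments_request(video_id, api_key, max_results=100):
    params = {
        "part": "snippet",          # include the comment text and metadata
        "videoId": video_id,
        "maxResults": max_results,  # YouTube caps this at 100 per page
        "key": api_key,
    }
    return f"{YOUTUBE_COMMENTS_URL}?{urlencode(params)}"
```

Porting this to C# was mostly a matter of issuing the same URL through HttpClient.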
Accomplishments that we're proud of
Being able to dynamically generate Angular components. Writing two functioning APIs in different languages. Working well together as a team.
What we learned
MVC architecture. Angular front-end development. T-SQL and Microsoft SQL Server.
What's next for FeelMusic
Better sorting interface. User accounts and playlists.
We based our HTTP request code on a Stack Overflow answer, since submitting requests follows a fairly standard pattern. Our createComponent method is based on an implementation of ComponentRef also found on Stack Overflow.