Inspiration

We wanted to create something that combined machine learning with YouTube's API.

What it does

Each word in a video's comments is classified as positive or negative using the Bing sentiment lexicon. The positive and negative counts are then combined across all comments, and the difference determines whether the video's overall sentiment is positive or negative.
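
A minimal sketch of that scoring step, assuming the comments are already in a character vector and using the tidytext package's copy of the Bing lexicon (our actual code may have differed in the details):

```r
library(dplyr)
library(tidytext)

# Hypothetical input: one comment per element of a character vector
comments <- c("I love this video, the editing is great",
              "terrible audio and a boring topic")

comment_words <- data.frame(text = comments, stringsAsFactors = FALSE) %>%
  unnest_tokens(word, text) %>%                      # split comments into words
  inner_join(get_sentiments("bing"), by = "word")    # label each word pos/neg

# Tally positives vs. negatives; the sign of the difference is the verdict
tallies <- comment_words %>% count(sentiment)
score   <- sum(tallies$n[tallies$sentiment == "positive"]) -
           sum(tallies$n[tallies$sentiment == "negative"])
verdict <- if (score >= 0) "positive" else "negative"
```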

How we built it

We used R for the backend and a Shiny app for the front end. We had planned to display the data as a word cloud for presentation purposes, but we ran out of time and were unable to get that far.
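
A rough sketch of how the Shiny front end could sit on top of the R backend; fetch_comments() and score_comments() are hypothetical helper names standing in for the actual backend functions:

```r
library(shiny)

# fetch_comments() and score_comments() are hypothetical wrappers around the
# YouTube request and the Bing-lexicon scoring; they are not the actual code.
ui <- fluidPage(
  titlePanel("YouTube Sentiments"),
  textInput("video_id", "YouTube video ID"),
  actionButton("go", "Analyze"),
  textOutput("verdict"),
  plotOutput("freq_plot")
)

server <- function(input, output) {
  scored <- eventReactive(input$go, {
    comments <- fetch_comments(input$video_id)
    score_comments(comments)   # assumed to return counts per sentiment
  })

  output$verdict <- renderText({
    s <- scored()
    pos <- sum(s$n[s$sentiment == "positive"])
    neg <- sum(s$n[s$sentiment == "negative"])
    if (pos >= neg) "Overall sentiment: positive" else "Overall sentiment: negative"
  })

  output$freq_plot <- renderPlot({
    s <- scored()
    barplot(s$n, names.arg = s$sentiment, main = "Positive vs. negative words")
  })
}

shinyApp(ui = ui, server = server)
```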

Challenges we ran into

Combining our code was the hardest part. Since neither of us knew how the other's code was structured, it was difficult to decide how to integrate the two halves. Our biggest problem came when the unexpected happened: instead of receiving the words along with the frequencies at which they appeared, we only got the frequency numbers. This made it impossible to create a word cloud, so we scrambled to make a graph instead.
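
For illustration, a word cloud needs both the words and their counts, while a plain graph can be drawn from the counts alone. A minimal sketch with made-up words (not our actual data):

```r
library(wordcloud)

# Hypothetical sentiment words pulled from the comments
words <- c("great", "great", "love", "boring", "boring", "terrible")
freqs <- table(words)   # keeps the words as names alongside their counts

# With both words and counts, a word cloud works...
wordcloud(words = names(freqs), freq = as.numeric(freqs), min.freq = 1)

# ...with only the counts, a plain graph of frequencies is the fallback
barplot(as.numeric(freqs), names.arg = names(freqs),
        main = "Word frequencies")
```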

Accomplishments that we're proud of

We're proud of actually getting the comment data from YouTube (the request is sketched below), building a basic working UI, figuring out how word clouds work, and successfully tallying up the frequencies of negative and positive words.
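
A hedged sketch of the comment-fetching step using the YouTube Data API v3 commentThreads endpoint via httr; the real request code may have been structured differently, and the key and video ID are placeholders:

```r
library(httr)

# YOUR_API_KEY and the video ID below are placeholders
get_video_comments <- function(video_id, api_key) {
  resp <- GET("https://www.googleapis.com/youtube/v3/commentThreads",
              query = list(part = "snippet",
                           videoId = video_id,
                           maxResults = 100,
                           textFormat = "plainText",
                           key = api_key))
  stop_for_status(resp)
  items <- content(resp, as = "parsed")$items
  # Pull the plain-text body of each top-level comment
  vapply(items,
         function(it) it$snippet$topLevelComment$snippet$textDisplay,
         character(1))
}

# comments <- get_video_comments("VIDEO_ID", "YOUR_API_KEY")
```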

What we learned

We learned to spread out big projects like this and to plan more meticulously beforehand. Combining the front end and the back end is a lot more difficult than it seems.

What's next for Youtube Sentiments

We plan to take on more machine learning projects in the future, as well as tie in more API usage.

Built With

R, Shiny, the YouTube Data API, and the Bing sentiment lexicon