Inspiration

After we attended the NSA Tech Talk, someone suggested that we try to build a program that matches songs with moods. Our idea was to use an API that takes pictures of people and determines their mood from the photos, then have the API talk to an Amazon Echo to play a matching song.

What it does

The site offers two external links, and it can also help the user pick the best song for a mood after showing pictures of nature. One link takes the user to an API that takes a picture and automatically determines the mood from it. The other link lets the user choose a mood and/or genre and then finds the songs that best match one or both.
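The mood/genre matching step could be sketched like this in JavaScript. The song catalog, field names, and mood labels below are all hypothetical placeholders, since the actual site relied on external links rather than its own database:

```javascript
// Hypothetical song catalog; in practice this data would come from
// an external service rather than being hard-coded.
const songs = [
  { title: "Here Comes the Sun", mood: "happy", genre: "rock" },
  { title: "Clair de Lune", mood: "calm", genre: "classical" },
  { title: "Weightless", mood: "calm", genre: "ambient" },
  { title: "Walking on Sunshine", mood: "happy", genre: "pop" },
];

// Return the songs matching a mood and/or a genre.
// Either filter may be omitted, matching "one or both of them".
function matchSongs(catalog, { mood, genre } = {}) {
  return catalog.filter(
    (song) =>
      (mood === undefined || song.mood === mood) &&
      (genre === undefined || song.genre === genre)
  );
}
```

For example, `matchSongs(songs, { mood: "calm" })` returns the two calm tracks, while calling it with no filters returns the whole catalog.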

How we built it

We used HTML to build the front-end website. We also tried altering an API to get it to talk to the Echo, but that part didn't work.

Challenges we ran into

We had difficulty figuring out the API interface needed to talk to the Echo.

Accomplishments that we're proud of

We are proud of integrating external links that help determine moods and songs for us. We are also proud of letting users think for themselves about moods and songs based on nature photos.

What we learned

We learned that it is not always easy to turn an idea into reality. We also learned that we can find alternatives and tweak our idea a little so that the final project stays close to the original vision.

What's next for Moodsic

Hopefully, one day we can actually get the API to talk to the Echo.

Built With

html
