[Image: interaction with Google Assistant on a phone]
I listen to a lot of music from archive.org. While the Internet Archive can stream music, the only decent way to do it is through their website, which gets annoying.
If you're not familiar, here is a link: https://archive.org/details/etree. There are some well-known artists on there, such as John Mayer and Jack Johnson.
I wanted to take advantage of voice search to play music from the archive at home, so Google Actions seemed like the best option given that I already had a Google Home!
What it does
You can start the app by asking Google to talk with "Live Music Archive".
It will then prompt you for the artist you want to listen to. You can use keywords like "give me the latest show" or "play me a show from 1979" and it will deliver!
If it can't find a show, it will let you know.
Once a show is found, it loads it onto Logitech Media Server (Google Actions doesn't support casting audio longer than 120 seconds yet =( ) and gives you some details about what you're listening to.
How I built it
The majority of the app is written in Node.js. A webhook listens for user input in order to control the Logitech Media Server.
The user interaction and NLP are handled through Dialogflow. I use it to parse the artist names, dates, and keywords such as whether the user wants the newest show. If any of that data is missing, it prompts the user for it.
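As a rough sketch of the fulfillment side (the parameter names here are my own placeholders, not necessarily the exact ones this agent uses): Dialogflow posts the parsed parameters to the webhook, which either prompts for what's missing or moves on to the search:

```javascript
// Inspect a (v1-style) Dialogflow fulfillment request and decide
// whether to reprompt. `artist`, `date`, and `latest` are assumed
// parameter names for illustration.
function handleFulfillment(body) {
  const params = (body.result && body.result.parameters) || {};
  const artist = params.artist;
  const date = params.date;     // e.g. "1979"
  const latest = params.latest; // keyword like "latest show"

  if (!artist) {
    // Dialogflow speaks `speech` back to the user as a reprompt.
    return { speech: 'What artist would you like to listen to?' };
  }
  return {
    speech:
      `Looking for a show by ${artist}` +
      (date ? ` from ${date}` : latest ? ' (most recent)' : ''),
    artist,
    date,
    latest,
  };
}
```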
Challenges I ran into
My biggest challenge was lack of time. Between the long drive and working by myself, I only had about 20 hours to work on the entire project from start to finish. I am very happy with how it turned out, but with a full 36 hours I definitely could have added more features. Overall, I did not run into any significant road bumps.
Accomplishments that I'm proud of
Figuring out how archive.org's insane catalog works through the API. There is a lot of data to filter and sort through, so it took a while to figure out how to get what I wanted!
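For the curious: archive.org exposes an advancedsearch endpoint that takes Lucene-style queries and returns JSON. Something along these lines (the exact fields and sort I settled on may differ) pulls the newest show for an artist in the etree collection:

```javascript
// Build an archive.org advancedsearch URL for the etree collection.
// The field list and sort are illustrative choices.
function buildSearchUrl(artist, { year, latest } = {}) {
  let q = `collection:(etree) AND creator:("${artist}")`;
  if (year) q += ` AND year:(${year})`;

  const params = new URLSearchParams();
  params.append('q', q);
  params.append('fl[]', 'identifier'); // item ID, used to fetch files later
  params.append('fl[]', 'date');
  if (latest) params.append('sort[]', 'date desc');
  params.append('rows', '1');
  params.append('output', 'json');

  return `https://archive.org/advancedsearch.php?${params.toString()}`;
}
```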
I am also glad to have figured out how to actually stream the music to a player without any need for downloading. This really makes the user experience great.
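The trick, roughly: archive.org's metadata API (https://archive.org/metadata/&lt;identifier&gt;) lists every file in an item, and each file is directly fetchable at https://archive.org/download/&lt;identifier&gt;/&lt;name&gt;, so the MP3 derivatives can be handed straight to a player with no download step. A sketch:

```javascript
// Given an item identifier and its metadata-API response, return
// directly streamable URLs for the MP3 files in the item.
function streamUrls(identifier, metadata) {
  return (metadata.files || [])
    .filter((f) => /\.mp3$/i.test(f.name))
    .map(
      (f) =>
        `https://archive.org/download/${identifier}/${encodeURIComponent(f.name)}`
    );
}
```

Any of the returned URLs can be passed straight to LMS as a playlist entry.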
What I learned
archive.org has a LOT of data to use, and you can do some cool things with it.
NLP has also become super easy to bend to your will. I remember trying to make a Google Home application over a year ago and it was a major headache; this time it was nearly seamless.
What's next for Live Music Archive with Google Actions
I'll definitely keep using this project! I hope to add some card data with the new version 2 API (version 1 was recommended for now) so that I can view the archive page on my phone after it chooses a show.
I also plan to add more features: choosing a specific date, choosing between multiple sources, and other odds and ends. As the feature set grows, the voice interaction will become more and more natural, and you really will be able to ask it to play something exactly how you want.