Inspiration

I see spatial audio, combined with physically exploring a space to hear different songs, as a powerful and interesting medium for experiencing music. With the broad API capabilities offered by the hackathon sponsors, I saw an avenue for combining compelling auditory and visual assets into interactive, collaborative experiences for fans that could become a new channel for artist discovery and fan engagement.

What it does

The user enters an augmented reality portal and walks around, visiting areas that display musical artist material such as album covers, side passions, etc. When they walk close to an area, a song is triggered and begins to play, with content retrieved through the 7digital API and offered by Universal Music Group. If the user walks away, the song gets quieter, and once they are far enough away it stops playing. The hackathon version is meant to set the vision for a space of experiences driven by advances in the augmented reality capabilities of the iOS and Android devices already in the hands of millions of fans.
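The proximity behavior can be sketched as a simple distance falloff. The near/far thresholds below are illustrative values for the sketch, not the app's actual tuning (the real app does this inside Unity):

```javascript
// Linear falloff: full volume inside NEAR, silent at or beyond FAR.
// NEAR and FAR are illustrative values, not the app's actual tuning.
const NEAR = 1.0; // meters: full volume at or inside this distance
const FAR = 6.0;  // meters: playback stops at or beyond this distance

function volumeForDistance(distance) {
  if (distance <= NEAR) return 1.0;
  if (distance >= FAR) return 0.0; // far enough away: the song stops
  return (FAR - distance) / (FAR - NEAR);
}
```

Each frame, the distance from the user to the area's anchor is fed through a function like this to set the song's volume, which produces the walk-closer/walk-away effect described above.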

With the beginnings of a user space that lets fans interactively explore the 7digital library of amazing artists and content from Universal Music Group, the next step is customization based on user interactions and flexible searching across the various music-related APIs. I built out functional endpoints for the 7digital, BuzzAngle Music, and qloo APIs and put in placeholders for Spotify, Shazam, SoundCloud, Twitter, and YouTube. I have an architecture in my notes that unifies several of these datasets into something very compelling for users.

As an example, here's a major flow I imagine: the BuzzAngle Music APIs give the top results by sales and streams, highlighting the major artists most likely to interest the user. This data is cross-referenced with the 7digital search API to determine which of those artists have content covered under valid license arrangements, so that they can be displayed in the AR experience. qloo then takes this subset of artists and songs and generates complementary passion areas and recommendations that could help fill out the space in a fun and engaging way, and YouTube and Twitter can supply social signals related to those passion areas. Another input pipe I'd love to integrate is the Spotify API, which offers unique capabilities such as searching and cross-referencing mood elements of the music like danceability, valence, energy, and tempo. Lastly, a very exciting possibility for the AR experience is a hall of influence for a particular artist. qloo fits perfectly here, with some input from Spotify, to take users on a journey through how an artist got started and changed over time. Fans could help contribute their interpretations of the artist's trajectory to stardom.
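The BuzzAngle → 7digital → qloo flow above can be sketched as a small pipeline. The three functions passed in are stand-ins for the real API calls (which would be asynchronous HTTP requests); their names and shapes are my assumptions for the sketch, not the actual BuzzAngle, 7digital, or qloo contracts:

```javascript
// Sketch of the cross-referencing flow: charts -> license filter -> recommendations.
// The injected functions are hypothetical stand-ins for the real API calls.
function buildRoomArtists({ fetchTopArtists, hasLicensedContent, fetchRelated }) {
  // 1. BuzzAngle: top artists by sales/streams.
  const top = fetchTopArtists();
  // 2. 7digital: keep only artists with licensed, displayable content.
  const licensed = top.filter(hasLicensedContent);
  // 3. qloo: complementary passion areas / recommendations to fill the room.
  const related = fetchRelated(licensed);
  return { licensed, related };
}
```

Keeping each stage behind its own function is also what makes the placeholder APIs (Spotify, Shazam, etc.) easy to slot in later as additional stages or filters.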

With a collaborative AR space, other interesting possibilities open up, such as dancing in the space, writing and exploring lyrics, and other ways users can interact around the artist and their content. The ubiquity of iOS and Android devices is what sets something like this apart from more advanced but prohibitively expensive devices.

How I built it

  • Unity with the AR Foundation package and the Pocket Portal VR plugin
  • APIs used: 7digital, qloo, BuzzAngle Music
  • Graphical assets from the Unity Asset Store
  • Custom Express.js backend server for calling the various third-party API endpoints and normalizing their responses for the app's purposes
  • Unity AudioStream with low-level FMOD Studio support for playing audio streams from the 7digital API within Unity
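The heart of the backend is the normalization step: each third-party payload is mapped to one app-facing shape so the Unity client only ever parses one format. The raw field names below are assumptions about the upstream responses, not the real 7digital or BuzzAngle schemas:

```javascript
// Map a (hypothetical) 7digital track payload to the app's uniform shape.
function normalize7digitalTrack(raw) {
  return {
    source: '7digital',
    artist: raw.artist.name,
    title: raw.title,
    streamUrl: raw.streamUrl,
  };
}

// Map a (hypothetical) BuzzAngle chart entry to the same family of shapes.
function normalizeBuzzAngleEntry(raw) {
  return {
    source: 'buzzangle',
    artist: raw.artist_name,
    rank: raw.rank,
  };
}

// An Express route would just map upstream results through these, e.g.:
//   app.get('/tracks', async (req, res) => res.json(results.map(normalize7digitalTrack)));
```

This keeps all the per-API quirks in Node.js, where the data wrangling is fastest for me, and out of the Unity code.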

Challenges I ran into

I wanted to compete in the BuzzAngle predictive analytics contest, but a dataset was not made available. I started working on scraping the API, but the service was very delicate and often returned errors, and the API documentation could use some love with respect to input formats for required values. To save time, I opted to focus on the API's capabilities for my app rather than scrape together a dataset to experiment with ML prediction algorithms.

The 7digital streaming audio required a custom OAuth HMAC-SHA1 signature that was not documented anywhere except a sample webpage that generated valid requests. I spent quite a bit of time late at night tracing through how that worked so I could replicate it on my side, since I had already proven my streaming audio worked with valid 7digital audio stream URLs. Because each request needs a recent timestamp that changes the signature, I had to get that piece right to wire up the streaming audio completely and gain access to the entire 7digital library. There is also an old, unsupported Node.js library for 7digital that was a distraction, since the service has changed.

Accomplishments that I'm proud of

The experience is enjoyable and fun to show around. I think it is a great way to listen to different artists, and I can see many possible enhancements that would continue to make it a really great app. I'd previously done portal experiences with ARKit and ARCore separately, and this was my first time with Unity's AR Foundation package. I'm super happy to take advantage of the easier cross-platform capabilities of this approach versus the manual steps I previously needed to keep the iOS and Android versions co-existing in the same code base.

What I learned

I learned a lot about streaming audio, which is a big win for new experiences going forward. I also learned more about other approaches to making REST API calls from Unity. In the past I've either used a library that abstracts away the internal calls to an endpoint or used an ad-hoc solution that wasn't perfect; this time I found a nice open-source library that was flexible. I still have more to nail down in solidly connecting my strong Node.js/JavaScript skills with my Unity skills. Data wrangling is much faster for me in Node.js, so I try hard to push as much of it as possible into that portion of the overall code.

What's next for MusicDiscoveryAR

  • Wire up hot-loading of album art and streaming audio from 7digital (very close) so that rooms can adjust dynamically based on user actions and preferences
  • Include ARKit multiuser AR capabilities and ARWorldMap
  • Allow users to share their room
  • Track interactions from users to help them get the content they want
  • Integrate machine learning photo recognition mode that recognizes famous locations (especially music venues) and creates AR Music Discovery rooms based on the location and region.
