Inspiration

The Musing team has been producing live-streamed music events since 2018, together with many established live music producers and artists.

These activities have grown rapidly since early 2020, when the live entertainment industry suddenly and urgently needed live streaming, teaching us much about the strengths and weaknesses of the online media format for live concerts.

During live-streamed shows, the lack of real-time social and musical interaction between performers and online audiences often prevents the events from delivering the feelings of “now”, “here” and “together”, devaluing the experience for artists and music lovers.

At the same time, the technology, including state-of-the-art machine learning, music information retrieval (MIR), and interactive online environments, should now be mature enough to support innovative solutions to these problems. Moreover, it should be possible to use these technologies to add genuinely new value to the live music experience by connecting people in immersive online musical environments.

This inspired us to develop the concept for a dedicated virtual music space that enables a more synchronous experience of live music together with others online. The aim is to let people around the world co-create online live music experiences through musical expression in body movements and sounds, making live music online more interactive and engaging.

What it does

Music Matrix is a web app that generates an interactive audiovisual backdrop for a live music performance in real time, based on musical features of the performing artist's live audio and on the body movements of the online audience.

These musical expressions are embodied in the virtual environment by machine learning models that accumulate the users' movement input, synthesize and synchronize it with the live audio, and feed it back into the virtual environment as real-time audiovisual feedback on user interactions. This forms a musical action-perception feedback loop within the virtual music environment.

By enabling real-time movement and dance interactions, with responsive visual feedback synchronized to the live music, the Music Matrix app lets performing artists and audiences co-create the audiovisual experience of the online live show.

How I built it

We developed an audiovisual interface in Unity, with 3D objects that respond to musical features of the live audio signal. Performing musicians provide this signal from their device's microphone, and it drives the generation of the virtual environment.
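
For illustration, here is a minimal browser-side sketch of this kind of feature extraction, using the Web Audio API instead of our actual Unity implementation; the `onFeatures` callback is a hypothetical hook for whatever drives the 3D objects:

```typescript
// Minimal sketch (not our Unity code): rough audio feature extraction from the
// device microphone with the Web Audio API. `onFeatures` is an illustrative
// hook for whatever updates the visuals.
async function analyzeMicrophone(
  onFeatures: (rms: number, spectrum: Float32Array) => void
): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const timeData = new Float32Array(analyser.fftSize);
  const freqData = new Float32Array(analyser.frequencyBinCount);

  const tick = () => {
    analyser.getFloatTimeDomainData(timeData);
    analyser.getFloatFrequencyData(freqData);
    // RMS energy as a simple loudness feature, e.g. to scale or color objects.
    const rms = Math.sqrt(
      timeData.reduce((s, x) => s + x * x, 0) / timeData.length
    );
    onFeatures(rms, freqData);
    requestAnimationFrame(tick);
  };
  tick();
}
```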

For the online audience's input, we integrated PoseNet into Unity to capture audience members' body motion via the webcam and let this input influence the graphic components of the music visualization on the user interface.
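
A minimal sketch of the PoseNet side, assuming a webcam-backed `<video>` element and a hypothetical `sendToUnity` bridge into the Unity scene:

```typescript
import '@tensorflow/tfjs';
import * as posenet from '@tensorflow-models/posenet';

// Estimate body keypoints from the webcam each frame and forward the
// confidently detected ones. `sendToUnity` is a hypothetical bridge into
// the Unity scene.
async function trackAudienceMotion(
  video: HTMLVideoElement,
  sendToUnity: (keypoints: posenet.Keypoint[]) => void
): Promise<void> {
  const net = await posenet.load(); // default MobileNet-based model

  const loop = async () => {
    const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
    // Keep only keypoints (wrists, elbows, ...) detected with high confidence.
    sendToUnity(pose.keypoints.filter(k => k.score > 0.5));
    requestAnimationFrame(loop);
  };
  loop();
}
```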

The real-time music audio and the users' body movement input are synchronized by a real-time beat tracking algorithm. The rhythmically synchronized audio and movement data is retrieved from Unity and transcribed into MIDI by the Magenta.js Onsets and Frames model. This MIDI stream is fed to the Magenta.js GrooVAE model, which outputs latent representations of the individual user's rhythmic input, of the rhythmic input accumulated across all audience users, and of the music itself; interpolating between these latent spaces merges the audience's input with the live music. The GrooVAE output is then triggered by the user's movement interactions via PoseNet and sent into the Unity environment, where it drives rhythmic music visualizations synchronized with the music via the beat tracker. A sketch of this pipeline follows.
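
A rough Magenta.js sketch of that pipeline, under simplifying assumptions: the beat tracking and accumulation steps are omitted, `audienceRhythm` is assumed to be a NoteSequence built from movement events, and in practice the input sequences must match the GrooVAE checkpoint's data converter. Treat this as the shape of the pipeline, not drop-in code.

```typescript
import * as mm from '@magenta/music';

const ONSETS_CKPT =
  'https://storage.googleapis.com/magentadata/js/checkpoints/transcription/onsets_frames_uni';
const GROOVAE_CKPT =
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/groovae_4bar';

// Transcribe live audio to notes, then blend the audience's rhythm with the
// music halfway between the two in GrooVAE's latent space.
async function blendAudienceWithMusic(
  audioBuffer: AudioBuffer,
  audienceRhythm: mm.INoteSequence
): Promise<mm.INoteSequence> {
  const transcriber = new mm.OnsetsAndFrames(ONSETS_CKPT);
  const vae = new mm.MusicVAE(GROOVAE_CKPT);
  await Promise.all([transcriber.initialize(), vae.initialize()]);

  // Onsets and Frames turns the audio buffer into a MIDI-like NoteSequence.
  const musicNotes = await transcriber.transcribeFromAudioBuffer(audioBuffer);

  // A 3-step interpolation between the two inputs; the middle step is the
  // halfway blend of the music's rhythm and the audience's rhythm.
  const steps = await vae.interpolate([musicNotes, audienceRhythm], 3);

  transcriber.dispose();
  vae.dispose();
  return steps[1];
}
```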

Challenges I ran into

Getting live streaming into Unity: we implemented live streaming to the web via Wowza Streaming Cloud, intended as the artist's input to the app. But since HLS streaming is not natively supported in Unity, we ended up using only audio input via the device microphone, leaving integration of audiovisual HLS live stream input as near-future work.

Accomplishments that I'm proud of

Working together in a creative and productive process with a great, multi-talented team spread across four continents.

What's next for Music Matrix

We have many ideas and concepts, envisioned or partly implemented, that we want to complete and include in the app in the near future. For example, we developed a script for real-time retrieval of a distribution over genre tags from a live music audio signal. We want to integrate it with the app so that the style of the live music influences the interface rendering, e.g. via visual filters using image style transfer; a hypothetical sketch of that integration follows.
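
In the sketch below, `classifyGenres` stands in for our separate genre-retrieval script (not shown) and is assumed to return a probability distribution over genre tags; the filter names are purely illustrative.

```typescript
// Hypothetical stand-in for the genre-retrieval script: returns a probability
// distribution over genre tags for one frame of audio.
declare function classifyGenres(
  frame: Float32Array
): Promise<Record<string, number>>;

// Illustrative mapping from genre tags to visual filters for the backdrop.
const GENRE_FILTERS: Record<string, string> = {
  rock: 'grainy-high-contrast',
  electronic: 'neon-glow',
  jazz: 'warm-sepia',
};

// Pick the visual filter associated with the most probable genre tag.
async function pickFilter(frame: Float32Array): Promise<string> {
  const dist = await classifyGenres(frame);
  const [topGenre] = Object.entries(dist).reduce((a, b) => (b[1] > a[1] ? b : a));
  return GENRE_FILTERS[topGenre] ?? 'none';
}
```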

Interactive live music visualization, developed in Unity and enhanced by Magenta's ML technology, is a virtually limitless concept. We feel we have only begun to scratch the surface of its potential during the hackathon, and we will continue developing the project in the near future. In particular, we want to bring this project into our activities as live music producers, letting the results reach end users directly within our network and test group of artists, musicians, and music lovers, and bring added value to real-life live music productions.

This autumn we will take part in producing several live music events with professional artists, venues, and producers, where we want to continue developing and using the interactive online backdrop. Beyond this hackathon, the project will also evolve in part as an open-source experimental platform for research in user-oriented MIR and machine learning within live music performance contexts.

Built With

Unity, PoseNet, Magenta.js, Wowza Streaming Cloud
