Music Feels helps fans and artists connect through fandom, using spatial computing to focus attention on a point of interest while engaging people at a distance.
What it does
Fans can interact with artist media and connect in shared, remixable virtual spaces.
How we built it
We built it with Unity for Android, streaming media over Amazon S3. We used the volumetric assets provided and designed a customizable stage that plays video much like an LED screen would in the physical world. Using volumetric lighting, particles, and device user input in Unity, the user can communicate how the music makes them feel, then save the result to their phone, send it as a text message, or upload it to their favorite social media platform.
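As a rough illustration of the interaction described above, the sketch below maps device motion to the stage's particle emission and lighting intensity, and captures a shareable screenshot on a double tap. This is a minimal sketch, not the actual project code: the component name, Inspector fields, and the accelerometer-as-energy mapping are all assumptions.

```csharp
using UnityEngine;

// Hypothetical component: drives stage visuals from device input.
// Field names and thresholds are illustrative assumptions.
public class MusicFeelsInput : MonoBehaviour
{
    public ParticleSystem stageParticles; // assigned in the Inspector
    public Light stageLight;              // the stage's key light

    void Update()
    {
        // Use accelerometer magnitude as a rough proxy for how
        // energetically the user is moving to the music.
        float energy = Mathf.Clamp01(Input.acceleration.magnitude);

        // More movement -> denser particles and brighter lighting.
        var emission = stageParticles.emission;
        emission.rateOverTime = Mathf.Lerp(10f, 200f, energy);
        stageLight.intensity = Mathf.Lerp(0.5f, 2f, energy);

        // Double tap: save the current frame so it can be shared
        // by text message or on social media.
        if (Input.touchCount > 0 && Input.GetTouch(0).tapCount == 2)
            ScreenCapture.CaptureScreenshot("music-feels-moment.png");
    }
}
```

On Android, `ScreenCapture.CaptureScreenshot` writes to the app's persistent data path; a real build would then hand that file to the platform share sheet.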
Challenges we ran into
Installing and configuring the custom-made plugin that rendered the volumetric assets was difficult at first and took a long time. There was very little documentation available, so we had to work through it collaboratively.
Accomplishments that we're proud of
The scene we created is a nod to the tradition of sophisticated elegance in performance that Motown is known for. We dove deep into the brief and explored the viability of immersive content as viral media. That thinking produced a variety of format experiments we could test. We also reflected on current user behaviors, such as using device sensors to drive AR visualizations.
What we learned
We learned that 5G will allow us to produce richer experiences with a higher computational load, including streaming video onto in-scene objects and rendering high-poly 3D models.
What's next for Music Feels
We will test our build with biometric inputs such as cardiograph and EEG sensors, so that the particles, lighting, and volume become a reflection of how the user is experiencing the music.