Inspiration

Our project was inspired by a vision to make the magic of live music experiences available to everyone, especially those who may find concerts difficult to attend due to accessibility challenges or financial constraints. With virtual reality, we aimed to bring the concert to the living room, allowing users to create personalized experiences with customizable effects like lasers, colors, and more. This way, everyone can enjoy a unique, immersive music experience that feels both personal and accessible.

What it does

Upon entering the experience, you find a control table in front of you: choose your settings, then watch the effects unfold as the music plays. The goal of the experience is to visually enhance how you listen to music, so as you listen you can add effects to the environment, like those you would find at a concert.

How we built it

We built this experience in virtual reality for the Meta Quest 3 headset, using Unity and C#. We developed in Unity and used an XR interaction manager so we could test the experience even while the headset was in use or still being set up. We centered the experience around a control table with buttons, dials, and sliders that control the effects. We also decided that we wanted the music to sync with the effects.
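As a rough illustration of the control-table idea, here is a minimal sketch (in Python rather than the project's C#, with hypothetical names like laser_intensity) of how each control's normalized reading can map onto one effect parameter:

```python
# Hypothetical sketch: each physical control on the table (slider, dial,
# button) produces a normalized value that drives one effect parameter.
# The parameter names are illustrative, not the project's actual code.

def lerp(a, b, t):
    """Linear interpolation between a and b, clamping t to [0, 1]."""
    return a + (b - a) * max(0.0, min(1.0, t))

class EffectSettings:
    def __init__(self):
        self.laser_intensity = 0.0
        self.laser_color = (1.0, 1.0, 1.0)  # RGB, each in [0, 1]
        self.strobe_rate_hz = 0.0

def apply_controls(slider_intensity, dial_hue, strobe_on):
    """Map raw control readings (floats in [0, 1], or a bool) to effect state."""
    s = EffectSettings()
    s.laser_intensity = lerp(0.0, 1.0, slider_intensity)
    # Crude color dial: blend from red to blue as the dial turns.
    s.laser_color = (lerp(1.0, 0.0, dial_hue), 0.0, lerp(0.0, 1.0, dial_hue))
    s.strobe_rate_hz = 8.0 if strobe_on else 0.0
    return s

# Example: slider three-quarters up, dial turned fully toward blue, strobe on.
settings = apply_controls(slider_intensity=0.75, dial_hue=1.0, strobe_on=True)
```

In the actual Unity scene, the same idea applies: each interactable's value change fires an event that updates the corresponding effect component.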

Challenges we ran into

We ran into a couple of major issues over the course of development. The first was with our headset, which struggled to connect to our computer; it took us around a day to finally get our first tests running. We also had a few major issues with Unity, from endless recompiling to cascading null reference exceptions, which severely hampered our ability to write and test code.

Accomplishments that we're proud of

We came into this hackathon with a lot of motivation and excitement but little experience in XR development, so the thing we're most proud of is our ability to adapt to the many unknowns, readjust our expectations and workflow, and still come out with an enjoyable product.

What we learned

We learned a huge amount of Unity; having come into the event with essentially no Unity experience, we picked up almost everything from scratch. We also brushed up on C#, since not all of us had worked much in it. Finally, we learned about Fast Fourier Transforms (FFTs), which were useful for syncing effects to the audio.
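The FFT idea can be sketched outside of Unity. In a Unity project the per-frequency magnitudes typically come straight from AudioSource.GetSpectrumData, but this standalone Python sketch (a naive DFT, with hypothetical band ranges) shows the core technique: split the spectrum into bands and use each band's energy to drive an effect's intensity.

```python
import cmath
import math

def band_energies(samples, sample_rate,
                  bands=((20, 250), (250, 2000), (2000, 8000))):
    """Sum DFT magnitudes within each (lo, hi) Hz band.
    Naive O(n^2) DFT for clarity; a real app would use an FFT library,
    or Unity's AudioSource.GetSpectrumData, which returns magnitudes directly."""
    n = len(samples)
    # Magnitude of each DFT bin up to the Nyquist frequency.
    mags = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]
    bin_hz = sample_rate / n  # frequency width of one bin
    energies = []
    for lo, hi in bands:
        lo_bin = int(lo / bin_hz)
        hi_bin = min(int(hi / bin_hz), n // 2)
        energies.append(sum(mags[lo_bin:hi_bin]))
    return energies

# A 125 Hz sine sampled at 8 kHz: the bass band should dominate,
# so a bass-driven effect (e.g. laser intensity) would light up.
rate = 8000
samples = [math.sin(2 * math.pi * 125 * t / rate) for t in range(256)]
bass, mid, treble = band_energies(samples, rate)
```

Mapping each band's energy to a different effect (bass to lasers, treble to strobes, and so on) is one simple way to make the visuals feel locked to the music.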

What's next for Lazer Home

We have a couple of main goals for further innovating on the home listening experience. First, more development time would improve several areas, such as animations, effects, and customizability; a good example would be using more of the FFT data to sync additional effects. We also foresee using a free API, such as SoundCloud's, so that instead of having to upload audio files manually, we can seamlessly access an enormous library of music. Finally, we plan to implement generative AI to create custom effects synced to the music, for example blue star-shaped lasers rising from the ground.
