We were initially inspired by synesthesia and wanted to visualise it in virtual reality, synced with audio. However, we ran into issues manipulating the backgrounds in a way that would best represent synesthesia.
What it does
We chose to continue with audio visualisation in some form and created a new way to experience audio. The software generates spheres, associating each one with a range of frequencies; each sphere scales in size with the amplitude of its frequency range. For added visual stimulation, we also made a skybox that shifts hue randomly over time and had the spheres rise up from the ground.
How we built it
We created this in Unity, coding in C#.
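The core loop described above can be sketched roughly as follows. This is an illustrative reconstruction, not our exact shipped code: the class name, band count, scale multiplier, and the skybox shader's `_Tint` property are all assumptions, though `AudioSource.GetSpectrumData` is the real Unity API we relied on for the FFT.

```csharp
using UnityEngine;

// Hypothetical sketch of the visualiser (names and tuning values are
// illustrative). An AudioSource feeds Unity's built-in FFT, the spectrum is
// grouped into bands, and each band's amplitude drives one sphere's scale.
[RequireComponent(typeof(AudioSource))]
public class SphereVisualiser : MonoBehaviour
{
    public Transform[] spheres;          // one sphere per frequency band
    public float scaleMultiplier = 50f;  // tuned by eye
    public Material skyboxMaterial;      // assumes a shader with a "_Tint" colour

    private float[] spectrum = new float[512];
    private float hue;

    void Update()
    {
        // Sample the current audio spectrum (Unity API).
        GetComponent<AudioSource>().GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Split the FFT bins evenly across the spheres and average each band.
        int binsPerBand = spectrum.Length / spheres.Length;
        for (int i = 0; i < spheres.Length; i++)
        {
            float sum = 0f;
            for (int j = 0; j < binsPerBand; j++)
                sum += spectrum[i * binsPerBand + j];
            float amplitude = sum / binsPerBand;

            // Scale the sphere with its band's amplitude.
            spheres[i].localScale = Vector3.one * (1f + amplitude * scaleMultiplier);

            // Let the spheres drift upward for extra motion.
            spheres[i].position += Vector3.up * (Time.deltaTime * 0.2f);
        }

        // Slowly cycle the skybox hue over time.
        hue = Mathf.Repeat(hue + Time.deltaTime * 0.05f, 1f);
        skyboxMaterial.SetColor("_Tint", Color.HSVToRGB(hue, 0.6f, 1f));
    }
}
```

In Unity, this script would be attached to a GameObject holding the AudioSource, with the sphere transforms and skybox material wired up in the Inspector.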
Challenges we ran into
Issues that arose included: most of us had no experience with Unity whatsoever; creating filters that could distort the background proved difficult (an idea we abandoned); learning how to convert an audio signal into something usable in code; and deploying the app onto an Android device.
Accomplishments that we're proud of
We are proud that we actually have a product to show, deployed onto an Android device, that works with the Samsung Gear VR. We managed to combine several simple ideas to make an interesting scene in VR.
What we learned
We learned how to use Unity and code in C#.
What's next for VR Audio Visual
If possible, we could implement a UI where users can choose their own audio track to play through the software and exit at any time. We could also experiment with filters, make the skybox hue transitions smoother, revisit the image-distortion idea, and try to make the movement less linear. The project could also be ported to other VR platforms.