Since the development of 5.1 surround, a lot has changed in VR audio capabilities. The fundamental technology of playing waveforms over speakers is the same, but the spatial audio techniques developed for VR can also be applied to help blind people.

Blind people could use this concept to detect objects around them (currently only sound-emitting ones). For static objects like doors, we can use audio spatialization to make the user perceive the sound as coming from a particular direction. Image processing could be added to the VR pipeline to detect objects that don't emit sound.
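A minimal sketch of the spatialization idea, not taken from the project's code: one common way to make a sound appear to come from a direction is constant-power stereo panning, where the left/right channel gains depend on the source's azimuth. The function name and angle convention here are assumptions for illustration.

```python
import math

def constant_power_pan(azimuth_deg):
    """Map a source azimuth (-90 = hard left, +90 = hard right)
    to left/right channel gains using constant-power panning,
    so total perceived power stays constant as the source moves."""
    # Normalize azimuth to a pan angle in [0, pi/2]
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    left_gain = math.cos(theta)
    right_gain = math.sin(theta)
    return left_gain, right_gain

# A source directly ahead (0 degrees) reaches both ears equally.
left, right = constant_power_pan(0.0)
```

Real spatializers like the Oculus Audio SDK go further (HRTFs, interaural time differences), but panning captures the basic cue of level difference between the ears.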

What it does

Gives an insight into how the Oculus Rift can deliver a rich virtual surround-sound experience to a user by tracking the user's head movement and orientation.

How I built it

As audio VR is a complex project, we built a simple demonstration of how a moving sound source affects the intensity of the sound reaching the listener, using the Oculus Audio SDK. We used the Unity engine to create a virtual world containing the sound-emitting objects.
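The distance-to-intensity relationship our demo illustrates can be sketched in a few lines. This is a generic free-field inverse-square model, not the Oculus SDK's actual attenuation code; the function name and `min_distance` clamp are assumptions for illustration.

```python
import math

def intensity_at_listener(source_power, distance, min_distance=0.1):
    """Free-field inverse-square falloff: intensity drops with the
    square of the distance between the sound source and the listener."""
    d = max(distance, min_distance)  # avoid divide-by-zero at the source
    return source_power / (4.0 * math.pi * d ** 2)

# As the source moves from 1 m to 2 m away, intensity drops by 4x.
near = intensity_at_listener(1.0, 1.0)
far = intensity_at_listener(1.0, 2.0)
```

Game engines typically expose this as a rolloff curve on the audio source rather than raw physics, but the underlying behavior is the same: doubling the distance quarters the intensity.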

Challenges I ran into

Integrating Oculus with Unity, and writing the code to place a virtual ear around the sound source and make it move.

Accomplishments that I'm proud of

As this was the first hackathon ever for every member of our team, we're proud of working in the field of virtual reality and learning tons of other things, like game engines, along the way.

What I learned

Got an idea of how audio spatialization works, and got first-hand experience with some of the coolest hardware currently out there.

What's next for Immersive Audio demonstration using Oculus Rift VR

The ability to track the user's head orientation and position significantly empowers audio technology.
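To show why head tracking matters for audio, here is a small sketch of the core computation: once you know the head's yaw, a source's world-space direction can be converted into a direction relative to where the user is facing, which is what the spatializer actually needs. The function name and angle wrapping convention are assumptions for illustration.

```python
def relative_azimuth(source_azimuth_deg, head_yaw_deg):
    """Angle of the source relative to the direction the head is
    facing, wrapped into (-180, 180] so left/right are symmetric."""
    rel = (source_azimuth_deg - head_yaw_deg) % 360.0
    if rel > 180.0:
        rel -= 360.0
    return rel

# A source due east sits dead ahead when the user faces east,
# and 90 degrees to the left once the user turns to face south.
ahead = relative_azimuth(90.0, 90.0)
to_the_left = relative_azimuth(90.0, 180.0)
```

Re-running this every frame with the headset's orientation is what keeps a virtual sound "pinned" in the world as the user turns their head.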
