The story of 4DIO

Our vision was to create an immersive virtual audio environment that moves in real time with the user. We are a team well-versed in the hardware side of hacking, and we wanted to combine hardware (electronics and mechanics) and software hacks in a single project.

The general idea of our project was to 1) record audio in a 3-dimensional format (tagging each sound with the direction it comes from), 2) build a middle-man software interface that takes sound sources and converts them into their respective left and right headphone amplifications, and 3) build a system that detects the yaw rotation of the listener's head and plays the sound arriving from the corresponding virtual direction.

The first part was attempted by designing a phased microphone array and trying to replicate the recorded sound as three sound sources placed around the array. The idea of the third stage was to use deep learning to detect the direction of the head in real time and feed that value into the middle-man software. However, these two systems proved difficult to build within the hackathon.
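For context on the phased-array stage, delay-and-sum beamforming is the standard technique such an array relies on: each microphone's signal is delayed so that a plane wave arriving from a chosen direction adds up coherently. The sketch below is a generic illustration that assumes a linear array with known microphone positions, not the array we actually built.

```python
import numpy as np

def delay_and_sum(signals, mic_x, angle_deg, fs, c=343.0):
    """Steer a linear microphone array toward `angle_deg` (measured from
    broadside) by delaying each channel so that a plane wave from that
    direction sums coherently.

    signals : (n_mics, n_samples) array of recorded audio
    mic_x   : (n_mics,) microphone positions along the array axis, in metres
    fs      : sample rate in Hz
    c       : speed of sound in m/s
    """
    n_mics, n_samples = signals.shape
    # Arrival-time offset of the plane wave at each microphone, in seconds.
    delays = np.asarray(mic_x) * np.sin(np.radians(angle_deg)) / c
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    # Apply fractional delays in the frequency domain, then average channels.
    shifted = spectra * np.exp(-2j * np.pi * freqs * delays[:, None])
    return np.fft.irfft(shifted.sum(axis=0) / n_mics, n=n_samples)
```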

The second part is a Python script that places virtual sound sources in one to three locations and uses their positions to determine the left and right sound ratios. The sources are placed through a simple web UI hosted on a Flask server.
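To illustrate how this middle-man step can work, here is a minimal sketch that uses a constant-power panning law to turn each source's azimuth (plus the listener's head yaw, which the third stage was meant to supply) into left/right gains. The function names and the exact panning formula are our illustration, not necessarily what the project's script does; in the project, the source positions come from the Flask web UI.

```python
import math

def lr_gains(source_azimuth_deg, head_yaw_deg=0.0):
    """Constant-power panning: map a source direction (degrees, 0 = straight
    ahead, positive = to the listener's right) to left/right gain factors,
    compensating for the listener's head yaw."""
    # Direction of the source relative to where the head is pointing.
    rel = (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    # Clamp to the frontal half-plane and map [-90, 90] -> [0, pi/2].
    rel = max(-90.0, min(90.0, rel))
    theta = (rel + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

def mix_sources(azimuths, head_yaw_deg=0.0):
    """Combine the gains of up to three virtual sources into one L/R ratio."""
    left = right = 0.0
    for azimuth in azimuths:
        l, r = lr_gains(azimuth, head_yaw_deg)
        left += l
        right += r
    total = (left + right) or 1.0
    return left / total, right / total

if __name__ == "__main__":
    # Two sources: one straight ahead, one 60 degrees to the right,
    # with the listener's head turned 15 degrees to the right.
    print(mix_sources([0.0, 60.0], head_yaw_deg=15.0))
```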

While we weren't quite able to reach our ambitious final goal, there was plenty of interesting development throughout the stages. With all of the team members working on different aspects of the project, we learned a good deal about the technologies involved in our idea and have produced a prototype we are glad to present as a proof of concept.
