Alice is a reality distortion device built on the Oculus Rift. It distorts a real-time stereo view of your world to the sound of music.

Getting the stereo vision right was probably one of the hardest hardware features to nail, and the prototype is fairly precise. We pulled up data on interpupillary distances (IPD) in the general population to determine the ideal spacing, settled on 2.5 inches, and hardcoded that value into the Oculus's settings to enable realistic stereo vision. As a result, wearing the Oculus makes users feel as though their eyes are placed slightly further away from their face.

We then created audio listeners so we could augment the user's vision in sync with the beats of a song. This was built on open-source PyAudio code originally written to identify a tap or clap-like sound. That system was tuned to pick up a sharp, percussive sound in a quiet environment, so it wasn't very useful at first. We took this code, added improved reinforcement learning functionality, and heavily optimized the parameters so it could quickly lock onto beats in music with large changes in volume. One remaining issue was that it occasionally registered multiple beats within a short window, so we added a parameter that prevents a second beat from being recognized immediately after the first. The resulting signal drives changes in the visual effects applied to the Oculus in sync with the beat of the song; a rough sketch of the detection idea appears below.
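
The sketch below illustrates the general approach described above, not the project's actual code: a loudness spike relative to a running average counts as a beat, and a refractory period suppresses duplicate detections right after one fires. The function names, parameter values, and the RMS loudness measure are assumptions for illustration only.

```python
import time
import numpy as np
import pyaudio

# Assumed, untuned parameters -- the project's real values were hand-optimized.
RATE = 44100          # sample rate in Hz
CHUNK = 1024          # samples per buffer (~23 ms at 44.1 kHz)
SENSITIVITY = 1.5     # beat fires when loudness exceeds this multiple of the running average
DECAY = 0.05          # how quickly the running average tracks volume changes in the song
MIN_BEAT_GAP = 0.25   # refractory period (s): ignore beats that follow too closely

def listen_for_beats(on_beat):
    """Call on_beat() whenever a loudness spike is detected on the default input device."""
    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                     input=True, frames_per_buffer=CHUNK)
    running_avg = None
    last_beat = 0.0
    try:
        while True:
            samples = np.frombuffer(
                stream.read(CHUNK, exception_on_overflow=False),
                dtype=np.int16).astype(np.float64)
            level = np.sqrt(np.mean(samples ** 2))  # RMS loudness of this chunk
            if running_avg is None:
                running_avg = level
            # Adaptive threshold: judge each chunk against the song's recent volume,
            # so loud and quiet passages are handled relative to their surroundings.
            is_beat = level > SENSITIVITY * running_avg
            running_avg = (1 - DECAY) * running_avg + DECAY * level
            now = time.time()
            # Refractory period: suppress a second "beat" right after the first.
            if is_beat and (now - last_beat) > MIN_BEAT_GAP:
                last_beat = now
                on_beat()
    finally:
        stream.stop_stream()
        stream.close()
        pa.terminate()

if __name__ == "__main__":
    listen_for_beats(lambda: print("beat"))
```

In the actual device, the callback would trigger the next visual-distortion step on the Oculus view rather than printing to the console.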
