Video Sphere setup with spot audio in space
A framework was created that allows new video scenes to be added by dropping them into two lists, Light and Dark
The audio sprite moves through the space to help the user learn how transitions work so the story can progress
Audio and video are exported from Adobe Premiere Pro; the video then needs conversion to .ogg
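Unity's legacy movie textures only play Ogg Theora video, which is presumably why the Premiere exports need conversion. A minimal sketch of that conversion step, assuming ffmpeg is installed and on the PATH (the function name, file names, and quality values are illustrative, not from the project):

```python
import shlex
import subprocess

def build_ogg_command(src, dst, video_quality=7, audio_quality=5):
    """Build an ffmpeg command converting a Premiere export to
    Ogg Theora/Vorbis, the format Unity movie textures accept."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libtheora", "-q:v", str(video_quality),  # Theora video
        "-c:a", "libvorbis", "-q:a", str(audio_quality),  # Vorbis audio
        dst,
    ]

cmd = build_ogg_command("scene_light_01.mp4", "scene_light_01.ogg")
print(" ".join(shlex.quote(c) for c in cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the conversion
```

The command is built as a list (rather than a shell string) so file names with spaces pass through safely.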
Binaural Audio Recorder
Team Leader: Keith Bradley, Cell: 707-849-0485
Entertainment and Storytelling
Inspired by the sonic ambience around us. Binaural recording puts you in the exact sound field as originally intended: you can hear a bird taking off, and you can hear the band exactly as the musicians were positioned when playing.
The Verge writes: "For decades, binaural recording was a novelty, and overlooked for less technically demanding methods. But with the rise of virtual reality hardware like the Oculus Rift, HTC Vive, PlayStation VR, and Samsung’s Gear — systems dependent on realistic 3D audio to fully immerse their users — binaural audio is on the cusp of a renaissance. Binaural recording systems are unique because they emulate the workings of the human head. The architecture of our anatomy dictates how we understand the sounds we hear: with an ear on either side of a thick skull and spongy brain, we hear sounds enter our left and right ears at different times. If a dog barks by our left ear, it takes a few extra microseconds for the bark to reach the right ear; the sound will also be louder in one ear than the other. In addition, sound waves interact with the physical constitution of the listener — the pinna (or outer ear), the head, and the torso — and the surrounding space, creating listener-specific variations otherwise known as head-related transfer function. The brain scrutinizes these miniscule interaural differences of time and strength in order to localize sound with immaculate precision." via link
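The interaural time difference the quote describes can be estimated with simple geometry. A rough sketch using the Woodworth approximation for a spherical head (the head radius and speed of sound are typical textbook constants, not values from the article):

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Woodworth approximation of interaural time difference (ITD):
    the extra time a sound takes to reach the far ear, for a source
    at the given azimuth (0 = straight ahead, 90 = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly to the left reaches the right ear roughly 0.65 ms late;
# these are the tiny interaural differences the brain uses to localize sound.
print(f"{itd_seconds(90) * 1e6:.0f} microseconds")
```

A sound straight ahead gives an ITD of zero, which is why front/back confusion is common without the spectral cues of the outer ear.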
What it does - The Story
A Sci-Fi narrative experience using immersive sounds that we created to demonstrate the power of 360 video experiences with realistic visuals and sound.
In that context we developed The Blue Note: an immersive narrative using only natural sounds that takes the viewer through what living in parallel dimensions could feel like here in Boston. It’s a proof of concept where Stranger Things meets Under the Skin: on your 18th birthday you are given the power of inter-dimensional sight. Within the first few minutes of adulthood you either master your powers, which manifest as a ring of fire, a tear you can create in the inter-dimensions that allows you to see through space and time, or you risk being stuck in the alternate dimension for eternity.
In our world the alternate dimension is a dark reflection of the same environment. The audience will leave The Blue Note with a new understanding of the effect that sound has on our minds, and will consider the potential powers we could unlock within ourselves with a new way to view the world both visually and sonically.
How we built it - The Sound and Look of the Project
As a multi-disciplinary team consisting of an architectural designer, a software engineer, a 360 video entrepreneur, and a filmmaker, we all came to this project to explore full sensory immersion in virtual reality within a contained narrative.
The project was shot on location in one day on the Charles River featuring Boston street musicians.
We shot the “blue note dimension” with the Samsung Gear VR at 4K to give it a rougher, more ethereal look before adding effects in After Effects and Premiere. To capture the vividness of “our dimension” we used the GoPro Omni rig and gave the footage a saturated, high-contrast look. All of the sound was captured using the Zoom H2n and a custom-made binaural microphone. For the “blue note world” we mixed down the binaural ambisonics to stereo using Hokusai 2 and finished the sound design in GarageBand. All of the sounds were captured from the real world and designed specifically for each “blue note” scene.
The sound of the “real world” is a real-time binaural capture that was spatially mapped onto the sphere surface in post-production, using the horizontal positioning system of the Oculus.
The video editing was done in Adobe Premiere and AutoPano, and the GoPro capture was processed with Omni Importer. The final product was built in Unity using eight spheres, with the video applied as a movie texture on each sphere, plus extensive sound design.
Challenges we ran into
Because the audio needed to be spatial, each channel had to be exported separately and mapped to its location in each scene. This extended workflow adds a significant amount of time to exports and setup in Unity.
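The per-channel placement step amounts to converting each source's direction into a point on the video sphere. A minimal illustration of that math (the actual placement happened inside Unity in C#; the radius and the y-up, azimuth-from-forward convention here are assumptions chosen to match Unity's coordinate system):

```python
import math

def sphere_position(azimuth_deg, elevation_deg, radius=10.0):
    """Convert an audio source's direction into Cartesian coordinates
    on the video sphere (y up, azimuth measured from straight ahead)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = radius * math.cos(el) * math.sin(az)  # right
    y = radius * math.sin(el)                 # up
    z = radius * math.cos(el) * math.cos(az)  # forward
    return (x, y, z)

# Example: a musician 45 degrees to the right at ear height
x, y, z = sphere_position(45, 0)
```

Placing each exported channel at a point like this lets the engine's spatializer recompute level and timing differences as the viewer turns their head.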
The capture from the Omni rig wasn’t as seamless as we had hoped, and importing took a lot of time and computing power. The Samsung Gear VR is more portable, but also captures at lower quality. Unity’s limitations on video files as textures meant some of the footage had to be downgraded in this short time period. The equipment is capable of 4K or better, but the videos needed to be downsized for this experiment.
We had hoped that we would have more time to explore the story and really develop the idea that you are porting yourself between two dimensions.
Beyond time and technology, we also found it difficult to seamlessly blend the jazz instruments as an auditory cue to make the viewer turn around in the “blue note” universe. Ideally we would have started with the darkness and silence, and then had the sound cue drive the audience to turn their heads to experience “our world”. To address that challenge we flipped the experience to start with the jazz musicians, and then let the other “blue note world” be the contrast. We leveraged the idea of a Sci-Fi universe to overcome some of our technical disadvantages.
We had hoped that we would have more time to explore the surrounding area.
Accomplishments that we're proud of
We are proud of the technique, the result, and the strong combination of a multidisciplinary team. We set out to explore a story in which the audience stays in the same location but is moved between dimensions by audio cues. We think there is tremendous opportunity to explore cutting in virtual reality and using audio more specifically and pointedly. If we start to focus on the audio, we believe the audience will be more forgiving of the video, and this unlocks a tremendous opportunity for immersive storytellers.
What we learned
We learned a tremendous amount about the workflow of 360 video capture and trying to create an interactive element inside Unity. With the short timeline of the Hackathon, prerecorded assets would have been a huge benefit to the workflow.
What's next for "Blue Note"
We want to do stereo 360 video and further develop the narrative using sound as a primary conceptual driver. We also hope to use higher-resolution video files; an 8K capture rig would significantly improve immersion.
A video posted by tkbrdly (@tkbrdly) on Oct 9, 2016 at 10:34am PDT