To Test for Yourself
Process your MP4 video to receive a .vtt data-stream file using our dashboard: https://app.beatcaps.io/
Test the video output on the Video Workshop Conference demo:
https://beatcaps-workshop-conference.netlify.app/ Head to the webpage, click "Join the video conference," and then "Begin" to watch the video shake to the beat, or vibrate on mobile. Unfortunately, we did not add support for swapping in a .vtt file for videos other than the default one, but that is definitely on the to-do list.
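Under the hood, the demo's behavior can be sketched roughly as follows: listen for cue changes on the video's text track and translate each beat cue into a shake animation and a vibration. This is a minimal illustration rather than our exact demo code; the `BEAT strength=` cue payload, the element ID, and the `shake` CSS class are all assumptions made up for the sketch.

```javascript
// Minimal sketch: drive a shake animation + vibration from WebVTT beat cues.
// The "BEAT strength=..." payload format is a hypothetical example.

// Pure helper: map a cue's text to a vibration pattern (milliseconds).
function beatCueToVibration(cueText) {
  const match = /strength=([\d.]+)/.exec(cueText);
  const strength = match ? parseFloat(match[1]) : 0.5;
  // Stronger beats vibrate longer (50-200 ms).
  return [Math.round(50 + 150 * Math.min(strength, 1))];
}

// Browser-only wiring (guarded so the helper stays testable outside a browser).
if (typeof document !== "undefined") {
  const video = document.getElementById("conference-video"); // assumed id
  const track = video.textTracks[0]; // the BeatCaps .vtt track
  track.oncuechange = () => {
    const cue = track.activeCues[0];
    if (!cue) return;
    video.classList.remove("shake"); // restart the CSS shake animation
    void video.offsetWidth;
    video.classList.add("shake");
    if (navigator.vibrate) navigator.vibrate(beatCueToVibration(cue.text));
  };
}
```

The pure helper is kept separate from the DOM wiring so the beat-to-vibration mapping can be tuned and tested on its own.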
Streaming platforms that rely on sound alone are inaccessible to Deaf people, and single-sense online content fails to connect with younger audiences. BeatCaps is our attempt to remedy this, treating it not as an accessibility-compliance issue but as a customer-experience project.
What it does
We used dolby.io to enhance the music in MP4 video files, which range from movies to rhythm-fitness videos. Our BeatCaps technology then processes the high-quality music audio from these videos to generate a rhythmic data stream. That stream can be connected to a video player with visual and/or sensory effects, so the rhythm can be both visualized and felt through existing technological formats, enabling custom beat-driven experiences.
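For illustration, a rhythmic data stream like this can ride on the standard WebVTT caption format. The cue payload below is a made-up example of what a beat marker might look like, not a documented BeatCaps schema:

```
WEBVTT

00:00:00.500 --> 00:00:00.600
BEAT strength=0.8

00:00:01.000 --> 00:00:01.100
BEAT strength=0.6
```

Because it is ordinary WebVTT, any player that exposes text tracks can consume it without a new file format.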
How we built it
We utilized Dolby's Enhance API to make the music more prominent in the output video, which gives better results for the BeatCaps processing. We then applied Fourier-based beat tracking to the enhanced audio to identify the timing of each beat, synchronized to the video. Finally, we utilized the Communications API, building on the existing Workshop Conference sample and its videoPresentationService, to output a video together with the closed-caption data stream from our BeatCaps processing; the viewer interprets that stream to shake the video in time with the beat and, on a mobile phone, to vibrate to the beat.
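The beat-tracking step can be sketched with a simplified, energy-based onset detector. Our real pipeline uses Fourier analysis; this stand-in just flags windows whose energy spikes above the average, and the window size and threshold values are illustrative assumptions:

```javascript
// Simplified beat detector: flag audio windows whose energy is well above
// the average energy. A stand-in for the Fourier-based tracking described
// above; windowSize and threshold are illustrative defaults.
function detectBeats(samples, sampleRate, windowSize = 1024, threshold = 1.5) {
  const energies = [];
  for (let i = 0; i + windowSize <= samples.length; i += windowSize) {
    let e = 0;
    for (let j = i; j < i + windowSize; j++) e += samples[j] * samples[j];
    energies.push(e);
  }
  const avg = energies.reduce((a, b) => a + b, 0) / energies.length;
  const beats = [];
  energies.forEach((e, k) => {
    if (e > threshold * avg) beats.push((k * windowSize) / sampleRate);
  });
  return beats; // beat onset times in seconds
}
```

Each detected onset time can then be emitted as one cue in the output data stream.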
Challenges we ran into
Initially we wanted to demonstrate another side technology of ours based on the same principles as BeatCaps: running an OBS live stream of a video game through the Dolby platform, using closed captions and WebRTC so that clients watching the stream would receive haptic feedback through the mobile Vibrate API. Eventually, we got the Video Workshop Conference sample working and hacked away at it to run BeatCaps on a video, interpreting the data stream in the closed captions to do things such as animate the viewer and trigger vibration.
Since we were unable to get the initialization token and setup for the video conferencing platform, we opted to use the Media APIs to enhance our audio with music as the priority, so that the BeatCaps processing could generate a rhythmic data stream and output it to the video player to enhance the user experience. We enhanced the music in our videos by setting the Enhance API to optimize for music content; unfortunately, we were not able to isolate just the music in the settings (unlike the options available for speech), but we could still get better audio for BeatCaps to process. From that audio we then generated rhythmic beat captions.
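Submitting an enhance job tuned for music looks roughly like this. The request shape follows our recollection of the Media Enhance REST API, and the URLs and API key are placeholders, so treat this as a sketch rather than a verified call:

```javascript
// Sketch of a Dolby.io Media Enhance request optimized for music content.
// Field names follow our reading of the Enhance API; input/output locations
// and the API key below are placeholders.
function buildEnhanceJob(inputUrl, outputUrl) {
  return {
    input: inputUrl,
    output: outputUrl,
    content: { type: "music" }, // optimize enhancement for music, not speech
  };
}

// Hypothetical usage:
// fetch("https://api.dolby.com/media/enhance", {
//   method: "POST",
//   headers: { "x-api-key": "YOUR_KEY", "Content-Type": "application/json" },
//   body: JSON.stringify(
//     buildEnhanceJob("dlb://in/video.mp4", "dlb://out/enhanced.mp4")
//   ),
// });
```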
Accomplishments that we're proud of
We are proud of using existing technologies and formats to pass through data that can extend to haptics on the client side, without having to invent a new file format. We are also proud to have participated in this hackathon and used Dolby's amazing audio/visual APIs, which mesh well with our BeatCaps technology for an ultimately better user experience. Finally, we are proud to bring accessibility technology to the forefront of innovation: our hack demonstrates the need for companies such as Dolby to embrace existing accessibility technology for an enhanced customer experience, as a priority rather than as a compliance issue.
What we learned
The Dolby Communications API does not offer closed captions as an option for video streaming outside of WebRTC. We learned how to set up the Communications API for video presentation and experimented with the output to achieve our desired video shake and mobile vibration. The BeatCaps process is delayed by about half a second. We also learned that Dolby has an amazing collection of audio/visual APIs, and we discovered by accident a mute() call on the Video Presentation that our code did not like.
What's next for Beatcaps Remastered
We want to continue innovating on the BeatCaps Remastered technology to support live video streams and enhanced client-side user experiences, including AR, sensory output, and custom client-side lighting experiences using our BeatCaps/HapticCast technology. We see potential in reaching out to companies such as Dolby, which are at the forefront of audio/visual technologies, to help integrate BeatCaps into their products.