Inspiration
The spark for Movie Colab VR came from observing a massive disparity in the film industry. On big-budget Hollywood productions, reviews happen in $5-10 million screening rooms, allowing directors and cinematographers to see the film exactly as the audience will. But in the age of remote work, most creators end up reviewing critical shots on 13-inch laptops via web links.
We realized that while web reviews are functional, they lack the immersion required for accurate creative decision-making. We wanted to democratize the high-end screening room experience and make it accessible to anyone with a Quest headset, regardless of where they are in the world.
What it does
Movie Colab VR is not just a video player; it is a collaborative workspace. It allows remote teams (Producer, Director, DP) to converge in a virtual theater.
But our biggest innovation is the "data round-trip." In traditional workflows, someone has to frantically take notes during a review. In our app, the conversation is the documentation. As users speak and annotate frames in VR, our system records, timestamps, and transcribes the feedback. These notes are then automatically synced back to the Movie Colab web dashboard as actionable tasks.
How we built it
We built the application for the Meta Quest platform using the Unity Engine. The core challenge was creating a seamless multiplayer experience where high-resolution video playback stays perfectly synced across all users. We used Unity Gaming Services with the Unity Relay transport to handle session networking and user presence, with Vivox powering voice chat.
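The sync logic itself boils down to drift correction: the host periodically broadcasts its playhead position, and each client seeks only when it has drifted past a tolerance. A minimal sketch of that idea in TypeScript (names, packet shape, and the 250 ms threshold are all illustrative assumptions, not our actual Unity implementation):

```typescript
// Hypothetical drift-correction logic for keeping video playback in sync.
interface SyncPacket {
  sentAtMs: number;    // host wall-clock time when the packet was sent
  playheadSec: number; // host video position at that moment
  isPlaying: boolean;
}

const DRIFT_TOLERANCE_SEC = 0.25; // hard-seek only past this threshold

// Given the host's last packet and the local player state, decide
// whether this client should seek to stay in sync.
function correctDrift(
  packet: SyncPacket,
  localPlayheadSec: number,
  nowMs: number
): { seekTo: number } | null {
  // Estimate where the host's playhead is *right now*.
  const elapsedSec = packet.isPlaying ? (nowMs - packet.sentAtMs) / 1000 : 0;
  const expectedSec = packet.playheadSec + elapsedSec;
  const drift = Math.abs(localPlayheadSec - expectedSec);
  return drift > DRIFT_TOLERANCE_SEC ? { seekTo: expectedSec } : null;
}
```

Seeking only beyond a tolerance (rather than on every packet) avoids constant micro-stutters while still keeping everyone within a fraction of a second of the same frame.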
For the "Round Trip" feature, we integrated the VR app with our existing Movie Colab web backend. We captured audio and spatial annotation data in real time, processed it through a speech-to-text engine, and routed it via REST to populate the project's review session on the web.
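To make the data flow concrete, here is a sketch of what a round-trip note and its REST request could look like. The field names and the `/api/sessions/notes` endpoint path are hypothetical, assumed for illustration; the real Movie Colab API may differ:

```typescript
// Hypothetical shape of a round-trip review note sent from VR to the web.
interface ReviewNote {
  projectId: string;
  frameTimecode: string; // e.g. "00:04:12:07" — the frame being discussed
  transcript: string;    // speech-to-text output of the spoken feedback
  annotation?: { x: number; y: number }; // normalized screen position
  authorId: string;
}

// Build the REST request that syncs the note to the web dashboard,
// where it surfaces as an actionable task.
function buildNoteRequest(baseUrl: string, note: ReviewNote) {
  return {
    url: `${baseUrl}/api/sessions/notes`,
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(note),
  };
}
```

Because the note carries the exact frame timecode alongside the transcript, the web dashboard can link each task straight back to the moment in the cut it refers to.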
Challenges we ran into
One of the hardest parts was the UI/UX of bridging VR and 2D web workflows. We had to figure out how to display complex project statuses in VR without cluttering the immersive view. Additionally, making the voice-to-text transcription accurate enough to be useful in a professional setting required significant tuning of our audio capture logic.
Accomplishments that we're proud of
We are most proud of the "Round Trip" workflow. Seeing a user speak a feedback note in VR and watching it appear as a note on their producer’s laptop screen felt like magic. It validates our belief that VR can be a serious productivity tool, not just a consumption device.
What we learned
We learned that "presence" is about more than just seeing an avatar; it's about shared context. By synchronizing not just the video, but the data surrounding the video, we created a tool that actually speeds up production cycles.
What's next for Movie Colab VR
Our next step is "Context Engineering." Since we are capturing rich data (voice, visual annotations, and specific frame timestamps), we plan to translate it into artist tasks and to use the resulting dataset to train AI agents. Eventually, these agents could assist in the review process, proactively suggesting edits or flagging continuity errors based on previous reviews.
Built With
- google-cloud
- postgresql
- quest
- react
- rest
- unity
- vivox