What happens to the people we love when our dreams take us millions of miles away? Drawing on our experience developing The Longest Goodbye (a Sundance premiere and one of PBS's top-10 streamed titles), and on years of collaboration with NASA astronauts, psychologists, and spaceflight trainers, we saw an opportunity to translate the emotional realities of isolation, distance, and mission-driven pressure into an immersive, interactive narrative.

At the same time, the growing urgency of climate anxiety—and the generational divide it creates—felt deeply resonant. The tension between “leaving Earth to save humanity” and “saving Earth to save ourselves” shapes both the game’s drama and its worldbuilding.

Winterover became our way to explore:

  • the cost of scientific ambition,
  • the fractures within a family pulled across two planets,
  • and humanity’s relentless desire to tame nature, whether on Earth or on Mars.

Winterover is a story-driven experience, where the destiny of two worlds hinges on the fate of a single family.

🛠️ How We Built It

We built Winterover from the ground up using Unity, blending handcrafted environmental design with physically responsive gameplay and cinematic storytelling techniques.

🎭 Performance & Characters

All character performances were fully motion-captured using a hybrid pipeline combining inertial and optical capture.

Facial performances were polished in-engine and retargeted to stylized, expressive rigs.
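At its core, retargeting a captured facial performance to a stylized rig means remapping captured blendshape weights onto the target rig's channels, often with per-channel gain and clamping. A minimal, engine-agnostic sketch (channel names, gains, and the helper itself are illustrative, not from the Winterover pipeline):

```python
def retarget_blendshapes(captured: dict, mapping: dict) -> dict:
    """Remap captured facial blendshape weights (0-1) onto a stylized rig.

    'mapping' sends each capture channel to a (target_channel, gain) pair.
    Weights are clamped to [0, 1] after scaling so exaggerated gains on a
    stylized rig can't push a shape past its sculpted extreme.
    """
    out = {}
    for channel, weight in captured.items():
        if channel not in mapping:
            continue  # drop channels the stylized rig doesn't have
        target, gain = mapping[channel]
        out[target] = min(1.0, max(0.0, out.get(target, 0.0) + weight * gain))
    return out

# Hypothetical example: boost jaw motion 20% for a more expressive read.
result = retarget_blendshapes(
    {"jawOpen": 0.5, "browRaise": 0.3},
    {"jawOpen": ("JawOpen_Stylized", 1.2)},
)
```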

Our ensemble includes award-winning actors whose performances elevate the emotional stakes of the narrative.

🎮 Gameplay Systems

  • Real-world, physically based interactions (drilling, repairing, lifting, testing samples),
  • a dynamic holographic communication system for branching dialogue,
  • and environmental storytelling and clue tracking designed for VR comfort.
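Branching dialogue like the holographic calls above is commonly modeled as a node graph: each node holds a line of performance plus the choices leading out of it. A minimal sketch in Python (the node structure, ids, and sample lines are all illustrative, not from the actual Winterover codebase):

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One line of holographic dialogue plus the choices branching from it."""
    speaker: str
    line: str
    # Maps the player's choice label to the id of the next node.
    choices: dict = field(default_factory=dict)

# A tiny illustrative graph: a call home from the Mars habitat.
GRAPH = {
    "start": DialogueNode("Mother", "The storm knocked out the relay. Are you safe?",
                          {"Reassure her": "reassure", "Tell the truth": "truth"}),
    "reassure": DialogueNode("Player", "We're fine. Don't worry about us.", {}),
    "truth": DialogueNode("Player", "The habitat took damage. We're rationing power.", {}),
}

def advance(node_id: str, choice: str) -> str:
    """Follow the player's choice to the next node id."""
    return GRAPH[node_id].choices[choice]
```

In practice the same structure is usually authored as data (spreadsheet, graph editor, or a tool like Yarn/ink) and walked by code like `advance` at runtime.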

🔊 Audio & Music

Our audio pipeline was built entirely in FMOD, giving us:

  • real-time dynamic audio responses,
  • layered emotional scoring,
  • spatialized soundscapes for the Martian environment.
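Layered emotional scoring typically works by driving the volumes of several stacked music tracks from a single game parameter, which FMOD events can automate directly in the authoring tool. The crossfade logic behind that automation can be sketched engine-agnostically like this (layer names and the 0-1 "intensity" parameter are illustrative assumptions):

```python
def layer_volumes(intensity: float,
                  layers=("ambient", "tension", "climax")) -> dict:
    """Crossfade stacked music layers from a single 0-1 'intensity' value.

    Each layer peaks at its own point along the intensity axis and fades
    linearly toward zero on either side, so adjacent layers overlap and
    the score transitions smoothly instead of hard-cutting.
    """
    n = len(layers)
    volumes = {}
    for i, name in enumerate(layers):
        peak = i / (n - 1)               # where this layer is loudest
        distance = abs(intensity - peak)
        volumes[name] = max(0.0, 1.0 - distance * (n - 1))
    return volumes

# At low intensity only the ambient bed plays; at 0.5 the tension layer
# dominates; in between, neighboring layers blend.
```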

🌐 Working Across Three Time Zones

Our team collaborated across North America, Europe, and the Middle East, using continuous iteration cycles to keep animation, code, and narrative aligned.

📱 Optimizing for Standalone VR

One of our biggest hurdles was bringing a visually rich, animation-heavy narrative onto the Meta Quest 2. We implemented:

  • Aggressive GPU/CPU optimization,
  • Shader simplification and LOD systems,
  • GPU-instancing strategies,
  • and real-time asset streaming to keep memory under threshold.
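The LOD systems mentioned above boil down to swapping in cheaper meshes as objects recede from the camera, which Unity's LODGroup component handles by screen-space size. A distance-based version of the same idea, sketched engine-agnostically (the threshold values are illustrative, not our shipped tuning):

```python
def select_lod(distance_m: float,
               thresholds=(5.0, 15.0, 40.0)) -> int:
    """Pick a level-of-detail index from camera distance.

    0 is the full-detail mesh; each successive index is a cheaper mesh.
    Beyond the last threshold the object uses the cheapest LOD (a natural
    place to cull it entirely on a mobile GPU like the Quest 2's).
    """
    for lod, limit in enumerate(thresholds):
        if distance_m < limit:
            return lod
    return len(thresholds)
```

The GPU-instancing side is complementary: objects that share a mesh and material at the same LOD can be drawn in a single batched call, which is where most of the draw-call savings on standalone hardware come from.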

📚 What We Learned

We learned that immersive, gamified storytelling is far more complex than producing either a cinematic experience or a traditional video game. It requires:

  • The narrative discipline of filmmaking
  • The systemic design of game development
  • The physical interaction layer of XR
  • The emotional immediacy of live theatre

Designing a story where every physical action, dialogue choice, and environmental cue feeds into the emotional arc of the characters is exponentially harder than working in a flat medium.

We learned that:

  • Player agency must feel meaningful but never confusing.
  • Emotional stakes must survive the constraints of the platform.
  • And above all, XR storytelling demands empathy-driven design.

🚀 Future Improvements

  1. Enhanced Physics-Driven Interactions

More intuitive hand-tooling, multi-step puzzles, and embodied tasks suited for the next generation of Meta headsets.

  2. Procedural Environmental Events

Dust storms, power failures, and terraforming anomalies that meaningfully affect the narrative.

  3. Community-Driven Feedback Loop

Partnering with educators, science museums, and the NASA community for iterative updates and new mission-based content.

Check out the trailer: https://youtu.be/lo67FobSVF8
