# AXION-140: The Last Signal

## 🎯 Project Overview

AXION-140 is a 2-4 minute psychological sci-fi VR experience that challenges the boundary between safety and peril. Players assume the role of a rookie astronaut monitoring a routine mission that descends into cosmic horror. Through immersive spatial audio, interactive controls, and cinematic storytelling, users experience the chilling realization that isolation in space means no one can hear you scream.
## ✨ Features

- **Immersive VR Environment:** Fully realized spacecraft cabin overlooking an alien planet, with dynamic lighting and atmospheric effects
- **Spatial Audio Design:** 3D positional audio that tracks character movement, radio interference, and environmental anomalies to heighten tension
- **Interactive Gameplay:** Control the EMI exploration robot through physical button inputs and lever mechanics within the cabin
- **Dynamic Narrative:** Real-time radio communications with crew members create an evolving story that responds to player actions
- **Cinematic Camera System:** POV feeds from crew helmet cams displayed on in-cabin monitors, creating a found-footage aesthetic
- **Psychological Horror Elements:** Gradual signal degradation, visual glitches, and environmental storytelling build atmospheric dread
## 💡 Inspiration

AXION-140 explores the psychology of false security: the human tendency to feel safe in familiar environments even as danger approaches. Inspired by classic sci-fi horror such as Alien, Event Horizon, and 2001: A Space Odyssey, we wanted to create a VR experience in which the player's perceived safety within the spacecraft cabin becomes their prison. The project examines:

- **Isolation and Helplessness:** Being unable to physically help endangered companions
- **Technology Failure:** The terror when reliable systems begin to malfunction
- **The Unknown:** Confronting entities beyond human comprehension
- **False Control:** The illusion of agency through interactive elements that ultimately cannot prevent catastrophe

We aimed to prove that VR horror doesn't require jump scares; true fear comes from atmosphere, helplessness, and the slow realization that something is terribly wrong.
## 🚀 What It Does

AXION-140 places users inside the cabin of a spacecraft on an alien world during their first space mission. As a rookie astronaut, you:

- **Monitor a Routine Mission:** Your experienced crewmates Barbara and Ben venture into an abandoned space station to replace a reactor core
- **Operate Critical Systems:** Control the EMI emergency robot via physical buttons and levers within the cabin
- **Experience Escalating Tension:** Watch as routine communications deteriorate into static-filled warnings
- **Witness the Unknown:** Observe unexplained phenomena through multiple camera feeds
- **Face the Inevitable:** Discover you're not alone, and never were

The experience lasts 2-4 minutes, delivering a complete narrative arc from mundane mission briefing to psychological terror, all while the player remains physically seated within the safety of the cabin... or so they believe.

**Key Story Beats:**

- Mission briefing with lighthearted banter between the crew
- Deployment of the EMI robot to investigate the station
- Progressive signal interference and rising ICP readings
- Communication breakdown and visual glitches
- Catastrophic light-beam event from the station
- Final confrontation with otherworldly entities
## 🛠️ How We Built It

AXION-140 combines AI tools and traditional pipelines to create a production-quality VR experience.

### 3D Asset Generation
- **Meshy AI:** Rapid generation of 3D models for the spacecraft interior, control panels, EMI robot, and abandoned station architecture
- **Blender:** Manual refinement, UV unwrapping, and optimization of AI-generated assets for VR performance

### Visual Development
- **Sora & Nano Banana:** Concept visualizations, mood boards, and creative ideation to establish the emerald planet aesthetic, lighting conditions, and atmospheric color palette
- **Unity URP:** Real-time rendering with custom shaders for helmet HUD effects, monitor displays, and signal interference

### Audio Production
- **ElevenLabs:** Generated natural voice performances for Barbara and Ben, with emotional range spanning calm professionalism to terror
- **Unity Spatial Audio:** 3D audio positioning, radio filters, and progressive signal-degradation effects
- **Custom Sound Design:** Layered ambient station hum, ship systems, and alien-presence audio cues

### Animation & Motion
- **Mixamo:** Base astronaut animations for walking, gesturing, and character movement
- **Blender:** Baked and sequenced all animations into a cohesive timeline, refined character movements, created the EMI robot animations, and designed the unnatural alien-entity movements
- **Unity Timeline:** Integrated pre-baked animation sequences with dialogue triggers and interactive moments

### Interactivity & Programming
- **Unity with C#:** Core interaction system, button physics, lever mechanics, and state management
- **XR Interaction Toolkit:** VR hand tracking and haptic feedback for tactile button presses
- **Custom Dialogue System:** Dynamic audio playback synced with animation states and player actions

### Technical Pipeline
Concept (Sora/Nano Banana) → 3D Models (Meshy) → Refinement (Blender) → Animations (Mixamo/Blender) → Voice (ElevenLabs) → Integration (Unity) → VR Optimization (Meta Quest)
**Development Time:** A hackathon sprint, using AI tools to compress months of traditional asset creation into days.
## 🚧 Challenges We Overcame
### Animation Timeline Synchronization

**Challenge:** Coordinating 20+ individual animations (from Mixamo and other sources) with dynamic dialogue timing and player-triggered events created complex state-management issues.

**Solution:** We developed a custom Blender timeline-baking workflow that pre-synchronized all character animations, then used Unity's Timeline with activation tracks to trigger the baked sequences. This reduced runtime complexity while maintaining narrative flexibility.
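The ordering logic behind this approach can be sketched in plain C# (names here are ours, not from the project): baked segments are registered with start times, and a scheduler fires each one as playback time passes it.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of triggering pre-baked animation segments by playback
// time, mirroring what Unity Timeline activation tracks do for us in-engine.
public class BakedSequenceScheduler
{
    // Segment start times mapped to clip names, kept sorted by time.
    private readonly SortedList<double, string> segments = new SortedList<double, string>();
    private int nextIndex = 0;

    public void AddSegment(double startTime, string clipName) => segments.Add(startTime, clipName);

    // Returns the clips whose start time has been reached since the last call.
    public List<string> Advance(double playbackTime)
    {
        var fired = new List<string>();
        while (nextIndex < segments.Count && segments.Keys[nextIndex] <= playbackTime)
        {
            fired.Add(segments.Values[nextIndex]);
            nextIndex++;
        }
        return fired;
    }
}
```

In the actual project the equivalent work is done by Unity's Timeline driving activation tracks; this sketch only shows the ordering idea.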
### VR Performance Optimization on Mobile Hardware

**Challenge:** The Meta Quest 2's mobile chipset struggled with multiple animated characters, real-time lighting, and particle effects running simultaneously.

**Solution:** We implemented aggressive LOD systems, baked lighting for static elements, used texture atlasing to reduce draw calls, and optimized shaders to maintain 90fps. We reduced poly count by 60% while preserving visual fidelity through smart normal mapping.
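The LOD idea reduces to picking a mesh detail level from camera distance. A minimal sketch (thresholds and names are illustrative, not our shipped values):

```csharp
// Illustrative distance-based LOD selection: the returned index selects from
// an array of meshes ordered from highest to lowest detail. Thresholds are
// made-up example values, not the project's actual settings.
public static class LodSelector
{
    public static int Select(float distance, float[] thresholds)
    {
        for (int i = 0; i < thresholds.Length; i++)
            if (distance <= thresholds[i]) return i;
        return thresholds.Length; // beyond the last threshold: lowest LOD or culled
    }
}
```

In Unity this is normally handled by the built-in LODGroup component; the sketch just makes the selection rule explicit.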
### AI Asset Quality Control

**Challenge:** Meshy AI generated inconsistent topology and meshes that were not optimized for VR, causing performance issues and UV-mapping problems.

**Solution:** We established a rigorous post-processing pipeline in Blender: retopology for hero assets, automated UV-unwrapping scripts, and PBR material standardization. This turned AI outputs into production-ready assets.
### Emotional Pacing in a Fixed Duration

**Challenge:** Compressing a complete emotional arc (calm → tension → terror) into 2-4 minutes while preserving the player's freedom to interact.

**Planned Solution:** A "rubber band" timing system in which certain story beats wait for player input (button presses) while others trigger on timers with catch-up logic. This keeps the experience interactive while maintaining narrative momentum.
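One way the catch-up logic could work (all names and the 0.5 damping factor are illustrative choices, not a finished design): input-gated beats accumulate delay, and each timer beat absorbs only as much of that delay as the remaining slack allows, so the experience still ends near the target duration.

```csharp
using System;

// Illustrative "rubber band" pacer: waiting for player input pushes the
// schedule back, and later timer beats absorb only part of that delay,
// capped by the slack left before the target duration.
public class RubberBandPacer
{
    private readonly double targetDuration;
    private double delay = 0; // seconds accumulated while waiting for input

    public RubberBandPacer(double targetDurationSeconds) =>
        targetDuration = targetDurationSeconds;

    public void RecordInputDelay(double seconds) => delay += seconds;

    // A timer beat scheduled at t fires at t plus some of the accumulated
    // delay, never more than half the slack remaining before the target.
    public double FireTime(double scheduled)
    {
        double slack = Math.Max(targetDuration - scheduled, 0);
        double applied = Math.Min(delay, slack * 0.5);
        return scheduled + applied;
    }
}
```

For a 240-second target, a 10-second input delay shifts a beat scheduled at 200 seconds to 210, but a beat scheduled at 230 shifts only to 235, squeezing the arc back toward the intended runtime.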
## 💡 What We Learned

**The Biggest Lesson: AI Is a Collaborator, Not a Replacement**

The most profound insight from this project was understanding that AI tools are force multipliers for creativity, not shortcuts around craftsmanship.

Specific learnings:

- **AI Outputs Require Human Curation:** Every Meshy model needed manual refinement; every ElevenLabs voice line needed direction and emotional tuning. AI got us 70% of the way there; the final 30% required artistic judgment that only humans provide.
- **Technical Constraints Breed Innovation:** Being forced to optimize for Quest 2 hardware made us better developers. We learned aggressive optimization techniques (batching, occlusion culling, shader optimization) that will benefit all future projects.
- **Narrative Design Trumps Technical Complexity:** Early prototypes had branching dialogue and multiple endings. Cutting these for a focused, linear experience actually increased emotional impact. Sometimes less is more.
- **User Testing Is Irreplaceable:** Our assumptions about "scary" moments were wrong. What we thought would terrify players (aliens appearing) was less effective than subtler elements (radio static, rising ICP numbers). You cannot design fear in a vacuum; you need real reactions.
- **Cross-Tool Integration Is the Real Challenge:** The individual tools (Unity, Blender, ElevenLabs) worked great; making them communicate seamlessly consumed roughly half of our development time. Establishing clear file-naming conventions and asset pipelines early would have saved days.
## 🔄 What We'd Do Differently Next Time
### Implement Real-Time Multiplayer from Day One

**Why it matters:** Asymmetric co-op (one player in the cabin, one as a crew member) would exponentially increase replayability and emotional investment.

**How we'd do it:** Use Unity Netcode for GameObjects with low-latency voice chat via Vivox. The cabin player could warn the crew member about dangers visible only on the monitors, creating genuine co-op tension.

**Hackathon value:** Demonstrates scalability, networked-systems expertise, and an understanding of social VR trends.
### Build a Modular Scenario System

**Why it matters:** Currently the entire experience is hard-coded. A modular system would allow rapid creation of new episodes or community-generated content.

**How we'd do it:** Implement a JSON-driven scenario format defining:

- Dialogue sequences with timing windows
- Interactive object states and triggers
- Environmental events (lights, sounds, effects)
- Victory/failure conditions

**Hackathon value:** Shows software-architecture thinking, extensibility, and a product-roadmap vision beyond the demo.
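A minimal sketch of what loading such a format could look like with System.Text.Json (requires .NET 5+ for records); the field names here are illustrative, not a finalized schema:

```csharp
using System;
using System.Text.Json;

// Hypothetical scenario schema; property names are illustrative, not a spec.
public record DialogueLine(string Speaker, string AudioClip, double StartTime);
public record ObjectTrigger(string ObjectId, string Event, string Action);
public record Scenario(string Title, DialogueLine[] Dialogue, ObjectTrigger[] Triggers);

public static class ScenarioLoader
{
    // Accept camelCase keys from hand-written JSON files.
    private static readonly JsonSerializerOptions Options =
        new JsonSerializerOptions { PropertyNameCaseInsensitive = true };

    public static Scenario Load(string json) =>
        JsonSerializer.Deserialize<Scenario>(json, Options)
        ?? throw new InvalidOperationException("empty scenario file");
}
```

Keeping the schema as plain records means a scenario editor or community tooling can produce content without touching game code.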
### Implement Procedural Audio Generation

**Why it matters:** Pre-recorded dialogue limits localization and iteration. Real-time TTS with emotional control would enable dynamic responses.

**How we'd do it:** Integrate the ElevenLabs API for runtime voice generation, with emotion parameters driven by game state. Cache frequently used lines for performance.

**Hackathon value:** Shows API-integration skills, internationalization thinking, and an understanding of AI limitations (latency, cost management).
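The caching side can be sketched independently of any particular TTS service; the generator delegate below stands in for the real API call, whose actual interface we don't model here:

```csharp
using System;
using System.Collections.Generic;

// Cache sketch for runtime TTS: generated audio is keyed by (line, emotion)
// so repeated requests skip the network round-trip. The generate delegate is
// a stand-in for the actual TTS API call.
public class TtsCache
{
    private readonly Dictionary<(string, string), byte[]> cache =
        new Dictionary<(string, string), byte[]>();
    private readonly Func<string, string, byte[]> generate;

    public int ApiCalls { get; private set; } // how many times we hit the API

    public TtsCache(Func<string, string, byte[]> generate) => this.generate = generate;

    public byte[] GetOrGenerate(string line, string emotion)
    {
        var key = (line, emotion);
        if (cache.TryGetValue(key, out var audio)) return audio;
        ApiCalls++;
        audio = generate(line, emotion);
        cache[key] = audio;
        return audio;
    }
}
```

A real integration would also persist the cache to disk and cap its size, since both latency and per-character cost make redundant generation expensive.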
### Create a Comprehensive Analytics Dashboard

**Why it matters:** Without data on where players look, what they click, and when they disengage, we're designing blind.

**How we'd do it:** Implement heatmap tracking for gaze direction, interaction timestamps, and session completion rates. Use Unity Analytics or a custom backend to aggregate the data.

**Hackathon value:** Demonstrates data-driven design methodology, user-research competence, and iterative development practices.
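Gaze-heatmap aggregation can be sketched as simple angular binning (bin size and names are our illustrative choices): each gaze sample's yaw/pitch is bucketed into a coarse grid, and the hottest cells show where players actually looked.

```csharp
using System;
using System.Collections.Generic;

// Illustrative gaze-heatmap aggregation: yaw/pitch samples are bucketed into
// a coarse angular grid keyed by (yawBin, pitchBin).
public class GazeHeatmap
{
    private readonly Dictionary<(int, int), int> bins = new Dictionary<(int, int), int>();
    private readonly double binDegrees;

    public GazeHeatmap(double binDegrees) => this.binDegrees = binDegrees;

    private (int, int) KeyFor(double yawDeg, double pitchDeg) =>
        ((int)Math.Floor(yawDeg / binDegrees), (int)Math.Floor(pitchDeg / binDegrees));

    public void Record(double yawDeg, double pitchDeg)
    {
        var key = KeyFor(yawDeg, pitchDeg);
        bins[key] = bins.TryGetValue(key, out var n) ? n + 1 : 1;
    }

    public int CountAt(double yawDeg, double pitchDeg) =>
        bins.TryGetValue(KeyFor(yawDeg, pitchDeg), out var n) ? n : 0;
}
```

In-headset, the yaw/pitch samples would come from the camera (or eye-tracking) transform each frame; the aggregated bins are what a dashboard would visualize.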
### Establish an Automated Testing Pipeline

**Why it matters:** Manual VR testing is time-consuming. Automated tests would catch performance regressions and broken interactions faster.

**How we'd do it:** Use the Unity Test Framework for unit tests on interaction systems, automated profiling for frame-rate checks, and a CI/CD pipeline (GitHub Actions) for build validation.

**Hackathon value:** Shows professional development practices, quality-assurance understanding, and scalability awareness.
### Design for Accessibility from the Start

**Why it matters:** VR experiences often exclude users with motion sensitivity, hearing impairments, or limited mobility.

**How we'd do it:**

- Implement comfort-vignetting options
- Add subtitles with directional indicators
- Create a controller-free, gaze-based interaction mode
- Include difficulty settings (scary vs. suspenseful modes)

**Hackathon value:** Demonstrates an inclusive design philosophy, awareness of XR accessibility standards, and broader market thinking.
## 🏆 Accomplishments That We're Proud Of

### Innovation in Narrative VR
We pioneered a hybrid storytelling format that sits between cinematic film and interactive gaming. Unlike traditional VR games that emphasize exploration or combat, AXION-140 creates tension through enforced perspective: you cannot save your crew, only witness their fate. This subverts player expectations and creates genuine emotional impact.

### AI-Accelerated Production
By using multiple AI tools cohesively, we achieved production values that typically require dedicated 3D artists, voice actors, and concept artists. This demonstrates that small teams can now create AAA-quality VR experiences in less than a week, using AI as a force multiplier while maintaining artistic control.

### Psychological Horror in VR
We translated classic sci-fi horror tropes into VR without relying on cheap jump scares. Tension builds gradually through:

- Signal degradation (auditory unease)
- Rising ICP readings (narrative dread)
- Visual glitching (perceptual unreliability)
- Environmental anomalies (spatial wrongness)

This creates a more memorable and impactful experience than sudden shocks.

### Accessible VR Design
AXION-140 works for VR newcomers because it is a seated experience with simple interactions, a fixed perspective, and a short 2-4 minute duration. This makes it ideal for demos, festivals, and introducing non-gamers to VR storytelling.
### Technical Achievements

- Maintained 90fps on Meta Quest 2 hardware through aggressive optimization
- Implemented a seamless state-based dialogue system that adapts to player timing
- Created believable AI companions through synchronized animation, audio, and interaction
- Designed a modular scene system allowing rapid iteration and future episode expansion

### Cross-Disciplinary Fusion
We successfully merged:

- Film language (cinematography, pacing, framing)
- Game design (interactivity, player agency, feedback loops)
- Theater (environmental storytelling, spatial audio)
- VR affordances (presence, embodiment, immersion)

into a cohesive experience that leverages the strengths of each medium.
## 🎮 Target Platforms

- **Primary:** Meta Quest 2/3 (standalone VR)
- **Secondary:** PCVR (SteamVR, Oculus Rift)
- **Future:** PSVR2, Apple Vision Pro
## 📊 Impact & Applications

AXION-140 demonstrates applications beyond entertainment:

- **Training Simulations:** High-stress decision-making scenarios
- **Psychological Research:** Studying helplessness and fear responses
- **Film Pre-Visualization:** Testing cinematic VR storytelling techniques
- **Education:** Teaching space-mission protocols through immersive failure scenarios
## 🔮 What's Next

- **Episode 2:** Aftermath exploration, investigating what happened
- **Multiplayer Mode:** Asymmetric co-op with one player in the cabin, another as crew
- **Branching Narrative:** Player choices affecting crew survival
- **Expanded Universe:** Discovering what AXION 1-139 encountered
- **Creator Tools:** Let the community create their own cabin-based horror scenarios




