About the Project: Spatial Blueprint (Bio-Research Lab)

Welcome to the Spatial Blueprint Bio-Research Lab, an immersive 3D experience designed to merge the tension of stealth gameplay with the aesthetics of a high-stakes scientific facility.

🧪 Inspiration

The project was born from a fascination with "Mad Science" aesthetics and the potential for WebXR to create high-presence educational/escape-room simulations. I wanted to build a space that felt clinical yet claustrophobic, where light and shadow aren't just decorative but fundamental to the gameplay loop. The goal was to demonstrate how modern web technologies can deliver console-like atmosphere and interactivity directly in the browser.

🛠️ How I Built It

The foundation of the project is a modern React stack optimized for spatial computing:

  • Rendering: React Three Fiber (R3F) handles the scene graph, allowing for a declarative approach to 3D.
  • Physics: React Three Rapier provides real-time collisions and rigid-body dynamics for the lab props and player movement.
  • Immersion: WebXR support (via @react-three/xr) enables users to step inside the lab using VR headsets.
  • Post-Processing: Bloom, Vignette, and Noise effects layer together to create a cinematic, "found footage" horror aesthetic.
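With `@react-three/postprocessing`, an effect stack like the one above is declared as JSX inside the canvas. The sketch below shows the general shape; the parameter values are illustrative guesses, not the project's tuned settings:

```jsx
import { EffectComposer, Bloom, Noise, Vignette } from '@react-three/postprocessing'

// Illustrative values only; the project's actual tuning may differ.
function CinematicEffects() {
  return (
    <EffectComposer>
      <Bloom intensity={0.6} luminanceThreshold={0.3} />
      <Noise opacity={0.04} />
      <Vignette offset={0.2} darkness={0.9} />
    </EffectComposer>
  )
}
```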

Mathematical Foundations

Physics and AI logic rely on fundamental spatial mathematics. For instance, the detection logic for the Patrol Enemy calculates the Euclidean distance between the enemy and the player on the horizontal (XZ) plane:

$$d(P, E) = \sqrt{(x_P - x_E)^2 + (z_P - z_E)^2}$$

Where:

  • $P(x_P, z_P)$ is the Player's position.
  • $E(x_E, z_E)$ is the Enemy's position.

If $d(P, E) < \text{threshold}$, the "Caught" state is triggered.
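In plain JavaScript this check is only a few lines. The sketch below assumes positions are `{ x, z }` objects; the function names and the default threshold are illustrative, not the project's actual code:

```javascript
// Horizontal (XZ-plane) Euclidean distance between player and enemy.
// Positions are assumed to be { x, z } objects; names are illustrative.
function horizontalDistance(player, enemy) {
  const dx = player.x - enemy.x;
  const dz = player.z - enemy.z;
  return Math.sqrt(dx * dx + dz * dz);
}

// The "Caught" state triggers when the enemy is inside the detection radius.
function isCaught(player, enemy, threshold = 2.5) {
  return horizontalDistance(player, enemy) < threshold;
}
```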

Similarly, the lighting intensity $I$ at a distance $r$ from the lab's emergency point lights follows the Inverse Square Law:

$$I \propto \frac{1}{r^2}$$
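As a quick sketch, the falloff can be computed from a reference intensity at unit distance. The function name and epsilon guard are illustrative assumptions, not the project's code:

```javascript
// Inverse-square falloff: intensity at distance r, given intensity i0 at r = 1.
// A small epsilon guards against division by zero at the light's origin.
function inverseSquareIntensity(i0, r, epsilon = 1e-6) {
  return i0 / Math.max(r * r, epsilon);
}
```

Note that three.js point lights model this physically correct falloff when their `decay` property is left at its default of 2.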

🧠 What I Learned

  • Spatial State Management: Using Zustand to sync gameplay states (like keycard collection and incision counts) across React components and the 3D scene graph.
  • WebXR UX: Designing interactions that work equally well with a mouse/keyboard (FPS style) and VR controllers (teleportation/grabbing).
  • Asset Optimization: Balancing high-fidelity shadows with performance, utilizing `PCFShadowMap` for softened shadow edges.
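The state-management pattern described above can be illustrated with a minimal subscribe-based store in plain JavaScript. This mirrors what Zustand provides (a single store readable from both React components and imperative frame loops); the state shape and names are illustrative, not the project's actual schema:

```javascript
// Minimal subscribe-based store mirroring the Zustand pattern.
// The state shape (keycards, incisions, caught) is illustrative only.
function createStore(initialState) {
  let state = initialState;
  const listeners = new Set();
  return {
    getState: () => state,
    setState: (partial) => {
      state = { ...state, ...partial };
      listeners.forEach((fn) => fn(state));
    },
    subscribe: (fn) => {
      listeners.add(fn);
      return () => listeners.delete(fn); // unsubscribe
    },
  };
}

const gameStore = createStore({ keycards: 0, incisions: 0, caught: false });
```

Because `getState()` can be called imperatively, the same store serves React UI overlays and per-frame logic inside the 3D scene graph without prop drilling.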

🚧 Challenges Faced

  1. Performance Bottlenecks: Rendering a detailed lab full of physics-enabled objects was taxing on mobile VR hardware. I solved this with selective shadow casting: only critical objects, such as the `LabTray`, cast shadows.
  2. The "Sticky" Scroll Context: During development, an `overflow-x: hidden` rule on the main container broke the `position: sticky` behavior of certain UI overlays. I refactored the CSS architecture so the canvas stayed viewport-locked while the UI remained clean.
  3. Lighting Performance: Moving from a single `DirectionalLight` to a dynamic `PointLight` (flashlight) system required careful tuning of light distances to prevent frame drops in VR.
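The selective-shadow approach from the first challenge boils down to flipping each mesh's `castShadow` flag based on a whitelist. The sketch below runs over plain objects standing in for three.js meshes; the helper name and object names are illustrative:

```javascript
// Selective shadow casting: enable castShadow only for whitelisted meshes.
// In three.js every mesh carries a boolean castShadow flag; the plain
// objects here stand in for meshes, and the names are illustrative.
function applySelectiveShadows(objects, casters) {
  const allow = new Set(casters);
  for (const obj of objects) {
    obj.castShadow = allow.has(obj.name);
  }
  return objects;
}
```

In a real scene the same loop would typically be driven by `scene.traverse`, but the principle is identical: the shadow pass only pays for the objects that visually need it.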

Built with passion for the Logitech Spatial Challenge.
