Inspiration

We share our homes with pets, yet their perspective remains a mystery. Animal Vision MR gamifies perception, sparking imagination by allowing users to instantly swap their biological lens for that of a Dog, Cat, or Shark. We aim to prove that MR can transform a mundane living room into a vibrant playground without complex virtual environments.

What it does

Animal Vision MR is an immersive entertainment app that applies stylized "Vision Filters" to the real world using Meta Quest 3 Passthrough.

  • The Magic Moment: Stand in your room, use a hand gesture to select "Shark," and watch reality submerge into a deep-sea blue, accompanied by the sound of waves.
  • Artistic Filters: Includes Dog Vision (yellow/blue contrast), Shark Vision (monochromatic blue), and Snake Vision (thermal sensor style).
  • Audio Synergy: We pair every visual shift with spatial SFX (e.g., panting or underwater ambience) to sell the illusion.
  • Note: These are artistic interpretations, not strict scientific simulations.
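To make the "artistic interpretation" idea concrete, the Dog Vision filter's yellow/blue contrast can be sketched as a per-pixel color transform. This is an illustrative Python sketch only (the function name and the simple averaging rule are our assumptions here) -- the app itself applies the effect on the GPU via color LUTs, not per-pixel CPU code:

```python
def dog_vision(r, g, b):
    """Approximate a dog's yellow/blue dichromacy for one RGB pixel.

    Illustrative sketch, not the app's actual shader: real canine
    color perception is more complex than a channel average.
    """
    # Collapse the red/green axis: dogs cannot distinguish these hues,
    # so both map to the same yellowish value.
    yellow = (r + g) // 2
    # Blue is preserved, giving the characteristic yellow/blue contrast.
    return (yellow, yellow, b)
```

Note how pure red and pure green become indistinguishable while blue passes through unchanged, which is the visual signature the filter aims for.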

How we built it

We developed this using Unity 6 and the Meta Presence Platform.

  • Passthrough API & Color LUTs: We used Color Look-Up Tables (LUTs) to alter the tone of the Passthrough footage while keeping virtual objects untouched.
  • AI-Assisted Workflow: To maximize efficiency, we used Generative AI to define color parameters and generate the base LUT textures, requiring only minor manual tuning.
  • Hand Tracking: Using the Meta Interaction SDK, we implemented intuitive pinch gestures, allowing users to switch modes without controllers.
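The offline LUT-generation step can be sketched as follows. This is a minimal Python sketch under assumed parameters (the 16-cell grid size, the `shark_lut` name, and the blue-ramp constants are all hypothetical); the real pipeline, driven by generative AI, emits a LUT texture that the Meta Passthrough API consumes at runtime:

```python
LUT_SIZE = 16  # cells per channel; hypothetical -- production LUTs are usually larger

def shark_lut():
    """Build a flattened RGB lookup table mapping every input color
    to a monochromatic deep-sea blue (the "Shark Vision" look)."""
    lut = []
    for b in range(LUT_SIZE):
        for g in range(LUT_SIZE):
            for r in range(LUT_SIZE):
                # Perceived brightness of the input cell (Rec. 601 weights), 0..1
                lum = (0.299 * r + 0.587 * g + 0.114 * b) / (LUT_SIZE - 1)
                # Remap brightness onto a blue ramp: dark water to pale cyan
                lut.append((int(40 * lum), int(120 * lum), int(155 + 100 * lum)))
    return lut
```

Because the table is precomputed, the headset only performs a single texture lookup per pixel at runtime, which is what keeps the approach lightweight.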

Challenges we ran into

We faced a critical "Vision Lock" bug in which users would get stuck in a single animal filter. The root cause was memory management: LUT resources were not being released between switches. We fixed it by strictly managing the lifecycle of LUT objects, ensuring each color map was disposed of correctly on every mode switch.
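The lifecycle fix follows a simple pattern: dispose of the outgoing LUT before loading the next one. Here is a language-agnostic sketch in Python (class and method names are hypothetical; the actual fix lives in Unity C#, where the equivalent is destroying the old LUT texture):

```python
class LutFilterManager:
    """Sketch of the lifecycle pattern behind the "Vision Lock" fix."""

    def __init__(self, loader):
        self._loader = loader  # callable: mode name -> LUT resource
        self._active = None    # currently loaded LUT resource
        self.mode = None

    def switch(self, mode):
        # Dispose the outgoing LUT *before* loading the next one.
        # Skipping this step leaked a resource per switch, which is
        # what eventually locked users into one filter.
        if self._active is not None:
            self._active.dispose()
            self._active = None
        self._active = self._loader(mode)
        self.mode = mode
```

The key design choice is that disposal is unconditional and happens inside the switch itself, so no code path can change modes without first releasing the previous color map.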

Accomplishments that we're proud of

  1. Lightweight: By relying on Meta's native LUT feature instead of heavy post-processing, the app remains highly performant.
  2. AI Integration: We successfully used AI to accelerate our texture generation pipeline.
  3. Stability: We overcame resource management hurdles to create a stable, crash-free experience.

What we learned

  • Passthrough is a Canvas: It is a programmable texture, not just a background.
  • Sound Sells the Sight: A blue filter is just blue. A blue filter plus underwater audio is a Shark.

What's next for Animal Vision MR

We have successfully launched V1.0 on the Store. Our roadmap focuses on transforming this "Visual Experience" into a "Phygital Game Platform."

  1. Passthrough Camera API: Implementing complex effects like Insect compound eyes (distortion/multiplication).
  2. Scene API: Using the room mesh to hide virtual "prey" behind real furniture for a hide-and-seek game.
  3. Asymmetric Multiplayer: A mode where a mobile user places virtual scent trails that the MR user can only track using specific animal filters.

Built With

  • Unity 6
  • Meta Presence Platform (Passthrough API, Interaction SDK)