Inspiration
Every year, hundreds of millions of pets pass away, leaving behind families with nothing but photos and fading memories. We asked ourselves: what if there were a place you could go to feel close to them again? Not a chatbot, not a photo slideshow, but a real, spatial experience where your pet exists in a world you can step into. PetWorld XR was born from that idea: using AI-generated 3D worlds and immersive XR to create something deeply personal, a place of reunion, comfort, and healing.
What it does
PetWorld XR is a WebXR experience on Meta Quest 3 where users step into an AI-generated 3D environment — like a cozy living room or a sunny park — and reunite with their pet. As you walk closer, your pet notices you, reacts with excitement, and responds to your presence and touch through animations and sound. A gentle AI-powered narrator speaks to each moment, personalizing the experience based on your pet's name, breed, and personality. It's not a game. It's a memorial you can visit.
How we built it
We used World Labs Marble to generate immersive 3D environments as Gaussian splat scenes, cleaned them in SuperSplat, and rendered them in the browser using SparkJS. The XR framework is IWSDK (Meta's Immersive Web SDK), which handles locomotion, hand tracking, and spatial interaction on Quest 3. Pet models were generated with Meshy and loaded as .glb files via Three.js. For the narrator, we chain GPT-4o-mini to generate context-aware narration, then pass it through ElevenLabs text-to-speech for a warm voiceover. The entire experience runs in the browser — no app install required.
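For anyone curious how the splat world and the mesh pet share one scene, here's a minimal sketch. It assumes Spark's SplatMesh API (package name per its docs; verify against the version you're using) and the standard Three.js GLTFLoader; file paths and tuning values are illustrative, not our exact code.

```typescript
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
// Assumed import: Spark publishes SplatMesh from @sparkjsdev/spark.
import { SplatMesh } from '@sparkjsdev/spark';

const scene = new THREE.Scene();

// Gaussian splat environment generated in Marble and cleaned in SuperSplat.
const world = new SplatMesh({ url: '/scenes/living-room.spz' });
scene.add(world);

// Rigged pet model generated with Meshy.
const loader = new GLTFLoader();
loader.load('/models/pet.glb', (gltf) => {
  const pet = gltf.scene;
  // Splats and meshes carry no shared authoring units, so scale and
  // ground-plane offset are tuned by hand until the pet sits on the floor.
  pet.scale.setScalar(0.4);          // hypothetical tuned value
  pet.position.set(0, -0.05, -1.5);  // nudge onto the splat scene's floor
  scene.add(pet);
});
```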
Challenges we ran into
Getting a Gaussian splat world and a rigged 3D pet model to coexist in the same scene was tricky: they go through fundamentally different rendering paths (splats vs. meshes), yet still have to agree on ground plane, scale, and depth. We also wrestled with narrator latency: the LLM-to-TTS chain can take several seconds, which breaks the emotional flow, so we had to pre-generate key lines as fallbacks. Network configuration at the hackathon venue made it difficult to test on the Quest 3 headset from our dev machines, requiring us to debug port forwarding and firewall rules on the fly.
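In spirit, the latency fallback looked like the sketch below. The function names and the 2.5-second budget are illustrative stand-ins, not our exact implementation, but the shape is the idea: race the live chain against a timeout, and fall back to pre-rendered audio if it loses.

```typescript
interface PetContext {
  name: string;
  breed: string;
  personality: string;
}

// Hypothetical stand-ins for the real GPT-4o-mini and ElevenLabs calls.
async function generateNarration(ctx: PetContext): Promise<string> {
  return `Welcome home. ${ctx.name} has been waiting for you.`;
}
async function synthesizeSpeech(text: string): Promise<ArrayBuffer> {
  return new ArrayBuffer(0);
}

// Pre-generated lines rendered to audio ahead of time, keyed by moment.
const fallbackAudio = new Map<string, ArrayBuffer>();

function timeout(ms: number): Promise<never> {
  return new Promise((_, reject) =>
    setTimeout(() => reject(new Error('narration timed out')), ms),
  );
}

// Race the live LLM-to-TTS chain against a latency budget; if it loses,
// play a pre-generated line so the moment never stalls.
async function narrate(moment: string, ctx: PetContext): Promise<ArrayBuffer> {
  try {
    const live = generateNarration(ctx).then(synthesizeSpeech);
    return await Promise.race([live, timeout(2500)]);
  } catch {
    return fallbackAudio.get(moment) ?? new ArrayBuffer(0);
  }
}
```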
Accomplishments that we're proud of
We built a full emotional interaction loop in under 24 hours — world generation, pet placement, proximity-based reactions, touch interaction, and narrated voiceover — all running live on a Meta Quest 3 through the browser. The moment you walk toward your pet and it turns to look at you is genuinely moving, even as a prototype. We're proud that we kept the scope tight and focused on the feeling rather than feature count.
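The "pet notices you" moment boils down to a distance check in the render loop. Here's a simplified sketch; the threshold, names, and commented-out animation hook are illustrative rather than our exact code.

```typescript
import * as THREE from 'three';

const NOTICE_RADIUS = 1.8; // meters; illustrative threshold
const camPos = new THREE.Vector3();
let hasNoticed = false;

// Called every frame from the XR render loop.
function updatePet(pet: THREE.Object3D, camera: THREE.Camera, dt: number): void {
  camera.getWorldPosition(camPos); // headset pose in world space
  if (pet.position.distanceTo(camPos) >= NOTICE_RADIUS) return;

  // Fire the excited reaction once, the first time the user steps close.
  if (!hasNoticed) {
    hasNoticed = true;
    // noticeAction.play(); // hypothetical AnimationAction from the GLB clips
  }

  // Smoothly turn the pet toward the user: snap with lookAt to find the
  // goal orientation, then slerp the original rotation toward it.
  camPos.y = pet.position.y; // keep the pet level with the floor
  const before = pet.quaternion.clone();
  pet.lookAt(camPos);
  const goal = pet.quaternion.clone();
  pet.quaternion.copy(before).slerp(goal, Math.min(1, dt * 4));
}
```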
What we learned
We learned that world models aren't just a rendering technique — they're an emotional medium. A Gaussian splat of a living room feels fundamentally different from a modeled one; it has a photographic warmth that makes the experience feel real. We also learned the practical realities of WebXR development: Three.js version conflicts, splat-mesh compositing, and the importance of testing on real hardware early and often. And we learned that in XR, less is more — a single pet turning to look at you is worth more than a hundred features.
What's next for PetWorld XR
We want to let users upload photos of their actual pet and generate a personalized 3D model, turning PetWorld XR from a generic experience into a truly personal memorial. We're also exploring persistent worlds where you can leave objects and messages for future visits, multiplayer so families can visit together, and integration with pet memorial services. Longer term, we see this as a platform for emotional XR experiences: not just pets, but any cherished memory you want to step back into.
Built With
- elevenlabs
- iwsdk
- meshy
- metaquest
- node.js
- sparkjs
- three.js
- typescript
- vite
- webxr
- world-labs-marble