🏥 Before You Go — Step Into the Doctor's Shoes

Inspiration

Medical procedures are terrifying for children, not because they're always painful, but because they're unknown. A needle stick or a bandage change is routine for doctors but completely foreign and frightening to a 7-year-old lying in a hospital bed.

I built Before You Go on site at Shriners Hospitals for Children. The question that drove me was simple: what if the child already knew exactly what was going to happen because they'd done it themselves?


What It Does

Before You Go is a VR experience for pediatric patients that lets children step into the doctor's shoes before their procedure.

The child puts on a Meta Quest 3 headset and enters a calm hospital waiting room. They choose from two procedure cards (more to come):

  • 💉 Blood Draw
  • 🩹 Dressing Change

They then enter a hospital room where they are the doctor. They pick up the tools, walk to the patient, and perform the procedure themselves, guided by simple step-by-step instructions floating in their view.

For this hackathon I fully implemented two procedures: Blood Draw and Dressing Change.

When they sit down in the real chair minutes later, nothing is a surprise.


How I Built It

Layer          Technology
VR Engine      Unity 6.3 LTS
Headset        Meta Quest 3
XR Framework   OpenXR + XR Interaction Toolkit 3.3.1
Rendering      Universal Render Pipeline (URP)
UI             World Space Canvas + Tracked Device Graphic Raycaster
3D Assets      Sketchfab + Unity Asset Store

I built the entire experience solo in 48 hours, from an empty Unity project to a fully deployable APK running natively on the Quest 3.

The architecture is scene-based: a lobby scene where the child selects their procedure, and individual procedure scenes for each interaction. All scenes use a shared StepManager system that advances instruction images as the player completes each action.
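A shared step system like the one described could be sketched as follows. This is a minimal illustration, not the project's actual code: the class shape, field names, and the idea of wiring CompleteStep() to interactable events are my assumptions.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of a shared StepManager: each procedure scene
// supplies an ordered list of instruction images, and interactables
// call CompleteStep() when the player finishes the matching action.
public class StepManager : MonoBehaviour
{
    [SerializeField] private Sprite[] instructionImages; // one image per step
    [SerializeField] private Image instructionDisplay;   // Image on the world-space canvas

    private int currentStep;

    private void Start()
    {
        ShowCurrentStep();
    }

    // Wire this to interactable events in the Inspector
    // (e.g. an XRGrabInteractable's Select Entered event).
    public void CompleteStep()
    {
        if (currentStep < instructionImages.Length - 1)
        {
            currentStep++;
            ShowCurrentStep();
        }
        else
        {
            // Last step done: hide the floating instructions.
            instructionDisplay.gameObject.SetActive(false);
        }
    }

    private void ShowCurrentStep()
    {
        instructionDisplay.sprite = instructionImages[currentStep];
    }
}
```

Keeping the manager scene-agnostic means each procedure scene only needs to supply its own sprite array and event wiring.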


Challenges I Faced

XR Interaction complexity — Getting UI buttons, grabbable objects, and 3D interactables to all work simultaneously on the same controller was my biggest technical challenge. World Space canvas interaction in VR requires a specific combination of Tracked Device Graphic Raycaster, XR Interaction Manager, and Input Action Manager that isn't well documented.
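Since that component combination tripped me up, here is a hedged sketch of a startup sanity check that logs which pieces are missing. The type names (TrackedDeviceGraphicRaycaster, XRInteractionManager) are real XR Interaction Toolkit components, but this helper script itself is illustrative, not part of the project.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Illustrative sanity check: attach to any object in the scene and
// assign the world-space canvas. Logs a warning for each missing piece
// of the VR UI setup described above.
public class XRUISetupCheck : MonoBehaviour
{
    [SerializeField] private Canvas worldSpaceCanvas;

    private void Awake()
    {
        if (worldSpaceCanvas.renderMode != RenderMode.WorldSpace)
            Debug.LogWarning("Canvas must use World Space render mode.");

        if (worldSpaceCanvas.GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            Debug.LogWarning("Canvas needs a TrackedDeviceGraphicRaycaster for XR pointing.");

        if (FindAnyObjectByType<XRInteractionManager>() == null)
            Debug.LogWarning("Scene needs an XRInteractionManager for interactables.");
    }
}
```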

48-hour scope management — I had to make hard cuts. Hand animations, full biometric integration, and the remaining 3 procedure scenes were all scoped out in favor of a polished, working two-procedure demo.

Child-appropriate design — Every piece of UI, every instruction, every interaction had to be friendly and non-threatening. I iterated on the visual language multiple times to ensure nothing felt clinical or scary.


What I Learned

  • Perspective-taking is one of the most powerful anxiety-reduction tools in pediatric psychology, and VR is the perfect medium for it
  • The simplest interaction mechanic that works is always better than a clever one that doesn't
  • Building at a children's hospital changes how you think about every design decision

What's Next

  • Brand new art style built from the ground up
  • Multilingual support (French, Spanish, Arabic)
  • Remaining 3 procedure scenes
