Inspiration

55 million people worldwide live with dementia. More than 10 million live with Parkinson's disease. 284 million struggle with anxiety disorders. Yet healthcare workers and caregivers often train without ever truly feeling what their patients experience daily.

The inspiration was simple but powerful: empathy cannot be taught — it must be felt. We wanted to build something that lets a nurse, doctor, or medical student genuinely experience the disorientation of dementia, the frustration of Parkinson's tremors, or the racing heart of severe anxiety — before they ever walk into a patient's room.

What It Does

EmpathyLens XR is a browser-based WebXR empathy training tool that places you inside a 3D hospital room and simulates three conditions from the patient's perspective:

🧠 Dementia (Margaret, 78)

  • Memory flash overlays ("Where am I?", "What day is it?")
  • Greyscale desaturation and disorientation drift
  • AI patient who repeats herself, trails off mid-sentence, and asks for her daughter

🫀 Parkinson's Disease (Robert, 71)

  • Continuous screen tremor simulation with burst intensification (sketched below)
  • Orange vignette and motion blur effects
  • AI patient with slow, drawn-out, frustrated speech
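
As a sketch of how a tremor like Robert's can be driven, the snippet below pairs a CSS keyframes shake with a vanilla JS burst timer; the `#scene` id and class names are hypothetical stand-ins for our actual markup.

```javascript
// Minimal sketch of the tremor effect. The "#scene" id and class names
// are hypothetical stand-ins for the real markup.
const style = document.createElement('style');
style.textContent = `
  @keyframes tremor {
    0%   { transform: translate(0, 0); }
    25%  { transform: translate(2px, -1px); }
    50%  { transform: translate(-2px, 2px); }
    75%  { transform: translate(1px, -2px); }
    100% { transform: translate(0, 0); }
  }
  .tremor       { animation: tremor 0.12s infinite; }
  .tremor-burst { animation-duration: 0.06s; } /* faster cycle = harder shake */
`;
document.head.appendChild(style);

const scene = document.querySelector('#scene');
scene.classList.add('tremor');

// Intensify the shake for a short burst every few seconds.
setInterval(() => {
  scene.classList.add('tremor-burst');
  setTimeout(() => scene.classList.remove('tremor-burst'), 800);
}, 5000);
```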

😰 Anxiety Disorder (Sarah, 34)

  • Live heart-rate readout climbing from 72 to 118 BPM (see the sketch after this list)
  • Pulsing red vignette synced to a Web Audio API heartbeat
  • AI patient speaking in rushed, panicked sentences
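
To show how lightweight this effect is, here is a minimal sketch of the climbing readout; the element ids are hypothetical stand-ins for our HUD markup.

```javascript
// Minimal sketch of the climbing heart-rate readout (element ids are
// hypothetical). The displayed rate eases from 72 toward 118 BPM, and
// the red vignette's pulse speed tracks the current rate.
const bpmLabel = document.querySelector('#bpm');
const vignette = document.querySelector('#vignette');

let bpm = 72;
const target = 118;

const timer = setInterval(() => {
  bpm = Math.min(target, bpm + 1);
  bpmLabel.textContent = `${bpm} BPM`;

  // One vignette pulse per beat: 60 / BPM seconds per cycle.
  vignette.style.animationDuration = `${(60 / bpm).toFixed(3)}s`;

  if (bpm === target) clearInterval(timer);
}, 400); // raise the readout by one BPM every 0.4 s
```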

Each simulation includes rotating educational tips for caregivers and a floating AI chat panel where you can have a real conversation with the patient, powered by Groq's Llama 3.1 model.

No VR headset required. It runs in any modern browser, on any device.

How We Built It

We built EmpathyLens XR entirely with free tools and open web standards, on a zero budget:

  • A-Frame 1.4 for the WebXR 3D hospital room environment
  • Groq API (Llama 3.1) for real-time AI patient conversations with condition-specific system prompts
  • Web Audio API for generating the anxiety heartbeat sound procedurally (sketched after this list)
  • Pure CSS animations for all visual effects (tremor, fog, vignette, memory flashes)
  • Vanilla JavaScript — no frameworks, no dependencies
  • GitHub Pages for free deployment and hosting
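
The heartbeat mentioned above is synthesized rather than sampled. Here is a minimal sketch of the approach: two short low-frequency thumps per beat, shaped by a fast-decaying gain envelope (frequencies and timings are illustrative).

```javascript
// Sketch of a procedural heartbeat: each beat is two short low-frequency
// thumps ("lub-dub") shaped by a fast-decaying gain envelope.
const ctx = new AudioContext();

function thump(time) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.type = 'sine';
  osc.frequency.setValueAtTime(55, time);                     // deep thud
  gain.gain.setValueAtTime(0.8, time);
  gain.gain.exponentialRampToValueAtTime(0.001, time + 0.15); // quick decay
  osc.connect(gain).connect(ctx.destination);
  osc.start(time);
  osc.stop(time + 0.15);
}

function scheduleBeat(bpm) {
  const now = ctx.currentTime;
  thump(now);        // "lub"
  thump(now + 0.2);  // "dub"
  setTimeout(() => scheduleBeat(bpm), (60 / bpm) * 1000);
}

// Must be kicked off from a user gesture so the AudioContext may start.
scheduleBeat(72);
```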

Each AI patient has a carefully crafted system prompt that captures their condition authentically — Margaret's confusion, Robert's physical frustration, and Sarah's racing anxiety — creating genuinely moving interactions.
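
Concretely, a patient turn is one request to Groq's OpenAI-compatible chat endpoint with the character's system prompt pinned first. The prompt below is an illustrative condensation rather than our production prompt, and the localStorage key name is hypothetical:

```javascript
// Sketch of one AI-patient turn via Groq's OpenAI-compatible chat endpoint.
// The system prompt is an illustrative condensation, and the localStorage
// key name is hypothetical.
async function askPatient(userMessage) {
  const apiKey = localStorage.getItem('groq_api_key');
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'llama-3.1-8b-instant',
      messages: [
        {
          role: 'system',
          content:
            'You are Margaret, 78, a hospital patient with moderate dementia. ' +
            'You repeat yourself, trail off mid-sentence, and often ask for ' +
            'your daughter. Stay in character and keep replies short.',
        },
        { role: 'user', content: userMessage },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```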

Challenges We Ran Into

  • GitHub secret scanning blocked our initial push because the Groq API key was hardcoded. We switched to a localStorage-based key prompt: each user supplies their own key, and nothing sensitive ever lands in the repo (see the sketch after this list).
  • A-Frame + CSS effects layering required careful z-index and pointer-events management to keep the HUD, chat panel, and 3D scene all interactive simultaneously.
  • Groq model deprecation — the llama3-8b-8192 model was decommissioned mid-build, requiring a quick switch to llama-3.1-8b-instant.
  • Chat panel UX — the initial full-screen chat obscured the immersive XR environment. We redesigned it as a compact floating panel in the corner, preserving the immersion.
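
The key-prompt fix from the first item above boils down to a few lines; the storage key name here is hypothetical:

```javascript
// Sketch of the key-prompt pattern (storage key name is hypothetical).
// The user's Groq key lives only in their own browser; the repo itself
// contains no secret for GitHub's scanner to flag.
function getGroqKey() {
  let key = localStorage.getItem('groq_api_key');
  if (!key) {
    key = window.prompt('Enter your Groq API key:');
    if (key) localStorage.setItem('groq_api_key', key.trim());
  }
  return key;
}
```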

Accomplishments That We're Proud Of

  • Built a fully functional, deployable WebXR app in a single day with zero budget
  • Created three distinct, emotionally resonant simulation experiences
  • The AI patients respond authentically in character — judges and testers found the conversations genuinely moving
  • The anxiety simulation's live BPM counter rising in real time creates immediate emotional impact
  • Achieved professional-grade UI/UX with glassmorphism, animations, and responsive design using only vanilla CSS

What We Learned

  • WebXR is far more accessible than people think: A-Frame makes immersive 3D experiences achievable without specialist knowledge (see the sketch after this list)
  • LLM system prompts can create deeply convincing character simulations when carefully crafted with medical context
  • The most impactful features were the simplest ones: a number counting up (BPM), words flashing on screen ("Where am I?"), and a screen that won't stop shaking
  • Empathy technology doesn't need expensive hardware: a browser and a good prompt are enough
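
To back up that first point, here is an illustrative sketch of how few DOM calls a basic A-Frame room takes (dimensions and colors are made up; it assumes the A-Frame 1.4 script is loaded):

```javascript
// Illustrative sketch of a minimal A-Frame room built from vanilla JS.
// Assumes the A-Frame script tag is already on the page, e.g.
// <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
const scene = document.createElement('a-scene');

const floor = document.createElement('a-plane');
floor.setAttribute('rotation', '-90 0 0');
floor.setAttribute('width', '8');
floor.setAttribute('height', '8');
floor.setAttribute('color', '#cfd8dc');

const bed = document.createElement('a-box');
bed.setAttribute('position', '0 0.4 -2');
bed.setAttribute('width', '1');
bed.setAttribute('height', '0.8');
bed.setAttribute('depth', '2');
bed.setAttribute('color', '#ffffff');

const light = document.createElement('a-light');
light.setAttribute('type', 'ambient');
light.setAttribute('intensity', '0.8');

scene.append(floor, bed, light);
document.body.appendChild(scene);
```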

What's Next for EmpathyLens XR

  • More conditions: stroke, autism sensory overload, schizophrenia, chronic pain
  • VR headset support: full stereoscopic mode for deeper immersion
  • Structured training modules: pre/post reflection prompts and empathy scoring
  • NHS/hospital pilot: partnering with healthcare institutions for formal training integration
  • Multi-language support: making the tool accessible globally
  • Voice conversations: replacing text chat with real-time voice using speech-to-text

Built With

A-Frame, CSS, vanilla JavaScript, Groq API (Llama 3.1), Web Audio API, GitHub Pages