Inspiration

Healthcare workers spend years learning about neurological conditions — but never truly feel what their patients experience. Dementia, Parkinson's, and anxiety disorders affect hundreds of millions of people worldwide, yet remain deeply misunderstood. We asked: what if a nurse could feel the tremors before treating them? What if a caregiver could hear the confusion before responding to it? EmpathyLens XR was born from that question.

What it does

EmpathyLens XR is an immersive WebXR experience that puts healthcare workers and caregivers inside the lived reality of neurological conditions — no headset required.

  • 🧠 Dementia (Margaret, 78) — greyscale vision, memory flash overlays ("Where am I?"), disorientation drift, and an AI patient who repeats herself and asks for her daughter
  • 🫀 Parkinson's (Robert, 71) — continuous screen tremor simulation with burst intensification, and an AI patient with slow, drawn-out speech
  • 😰 Anxiety Disorder (Sarah, 34) — live BPM counter rising to 118, pulsing heartbeat vignette via the Web Audio API, and a rushed, worried AI patient (the BPM ramp is sketched below)
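
The rising pulse is driven entirely from vanilla JavaScript. Here is a minimal sketch of the idea, assuming hypothetical #bpm-counter and #vignette elements rather than the project's actual markup:

```javascript
// Sketch: ramp the displayed heart rate from resting toward 118 BPM and
// pulse a vignette overlay in time with it. Element ids are hypothetical.
const bpmLabel = document.querySelector('#bpm-counter');
const vignette = document.querySelector('#vignette');

let bpm = 72;                      // resting rate
const target = 118;                // peak anxiety rate

function beat() {
  if (bpm < target) bpm += 0.5;    // slow climb toward the peak
  bpmLabel.textContent = `${Math.round(bpm)} BPM`;

  // One quick opacity pulse per beat, via the Web Animations API.
  vignette.animate(
    [{ opacity: 0.2 }, { opacity: 0.7 }, { opacity: 0.2 }],
    { duration: 300, easing: 'ease-out' }
  );

  setTimeout(beat, 60000 / bpm);   // next beat: 60s divided by current BPM
}
beat();
```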

Each simulation includes rotating educational caregiver tips and a real-time AI patient chat powered by LLaMA 3.1 running on Groq.
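
The tip rotation itself is just a timer over an array. A sketch with example tips (the real set is condition-specific):

```javascript
// Sketch: cycle educational caregiver tips on a timer. The tips and the
// #caregiver-tip element id are illustrative examples.
const tips = [
  'Approach from the front and make eye contact before speaking.',
  'Use short sentences and allow extra time for a response.',
  'Respond to the emotion behind the words, not the factual error.'
];

const tipBox = document.querySelector('#caregiver-tip');
let i = 0;

setInterval(() => {
  i = (i + 1) % tips.length;   // wrap back to the first tip
  tipBox.textContent = tips[i];
}, 8000);                      // rotate every 8 seconds
```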

How we built it

  • A-Frame 1.4 for the WebXR 3D hospital room environment
  • Groq API (LLaMA 3.1) for real-time AI patient conversations with condition-specific personalities (sketched below)
  • Web Audio API for procedurally generated heartbeat sounds (sketched below)
  • Vanilla JavaScript for all condition effects, animations, and simulation logic
  • CSS animations for visual overlays such as vignettes, tremors, and desaturation (the tremor approach is sketched below)
  • GitHub Pages for zero-cost deployment
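
A few sketches of these pieces follow. First, the AI patient call, assuming Groq's OpenAI-compatible chat endpoint and the llama-3.1-8b-instant model id; the persona argument stands in for the condition-specific system prompt:

```javascript
// Sketch: one round-trip to the Groq chat API. GROQ_API_KEY must be
// supplied; in a client-only demo it lives in the browser, which is fine
// for a hackathon but would need a server-side proxy in production.
async function askPatient(persona, userMessage) {
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${GROQ_API_KEY}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      model: 'llama-3.1-8b-instant',
      messages: [
        { role: 'system', content: persona },  // condition-specific personality
        { role: 'user', content: userMessage }
      ]
    })
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```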
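
Second, the procedurally generated heartbeat: two short low-frequency "thumps" per beat, shaped with a gain envelope. A sketch of the approach:

```javascript
// Sketch: a "lub-dub" heartbeat from raw oscillators. Note that browsers
// only allow an AudioContext to start after a user gesture.
const ctx = new AudioContext();

function thump(time, volume) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = 50;                    // low, chest-thump pitch
  gain.gain.setValueAtTime(volume, time);
  gain.gain.exponentialRampToValueAtTime(0.001, time + 0.12); // fast decay
  osc.connect(gain).connect(ctx.destination);
  osc.start(time);
  osc.stop(time + 0.12);
}

function heartbeat(bpm) {
  const period = 60 / bpm;
  let t = ctx.currentTime;
  setInterval(() => {
    thump(t, 0.8);           // "lub"
    thump(t + 0.18, 0.4);    // softer "dub" just after
    t += period;             // schedule against the audio clock, not the timer
  }, period * 1000);
}
heartbeat(72);
```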
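
And the tremor overlay. Anything that touches layout tanks the frame rate (see the challenges below), so the effect sticks to transform, which the browser can composite on the GPU. A sketch against a hypothetical #scene-wrapper element:

```javascript
// Sketch: continuous jitter with occasional bursts, transform-only so the
// compositor handles it without reflow. #scene-wrapper is hypothetical.
const scene = document.querySelector('#scene-wrapper');
let intensity = 2;                       // baseline jitter in pixels

setInterval(() => {                      // burst intensification every 4s
  intensity = 8;
  setTimeout(() => { intensity = 2; }, 600);
}, 4000);

(function tremor() {
  const x = (Math.random() - 0.5) * intensity;
  const y = (Math.random() - 0.5) * intensity;
  scene.style.transform = `translate(${x}px, ${y}px)`;
  requestAnimationFrame(tremor);
})();
```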

Challenges we ran into

  • Simulating realistic tremors in pure CSS and JS without degrading performance (hence the transform-only approach sketched above)
  • Crafting AI patient personas that felt authentic and medically accurate without being distressing (an example persona prompt follows this list)
  • Making WebXR work smoothly across mobile, tablet, and desktop without a headset
  • Balancing immersion with sensitivity — ensuring simulations educate rather than trivialise
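
Most of the persona work lived in the system prompt. An illustrative example of the shape we converged on for Margaret (not our verbatim production prompt):

```javascript
// Sketch: a persona system prompt. It pins the character, the condition's
// speech pattern, and safety rails in one place.
const margaretPersona = `
You are Margaret, a 78-year-old hospital patient living with dementia.
Stay in character at all times. Speak in short, gentle sentences. You
sometimes repeat yourself, lose your train of thought, and ask where
your daughter is. Never give medical advice. If the conversation turns
distressing, become quiet and confused rather than upset.
`.trim();
```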

Accomplishments that we're proud of

  • Built a fully functional, medically grounded empathy tool in a single day with zero budget
  • Three distinct, deeply immersive simulations, each with unique visual, audio, and AI elements
  • A live BPM counter + heartbeat audio that genuinely makes users feel anxious — mission accomplished
  • 100% browser-based, no app install, no headset needed — accessible to anyone anywhere

What we learned

  • Empathy can be engineered — the right combination of visuals, audio, and conversation creates genuine emotional understanding
  • AI personas need careful prompt engineering to stay in character while remaining safe and educational
  • WebXR is far more powerful than most developers realise, even without expensive hardware

What's next for EmpathyLens XR

  • 🎙️ Voice-based AI patient conversations (speak directly to the patient)
  • 🥽 Full VR headset support for deeper immersion
  • 📋 Post-simulation reflection prompts and empathy scoring
  • 🏥 Partnerships with medical schools and nursing programs
  • 🌍 More conditions: stroke, PTSD, chronic pain, autism spectrum

Built With

  • a-frame
  • css-animations
  • github
  • groq
  • llama-3.1
  • vanilla-javascript
  • web-audio-api
  • webxr