Inspiration

Healthcare workers dedicate their lives to caring for patients — yet they often have no way to truly feel what those patients experience. Conditions like dementia, Parkinson's, and anxiety are deeply misunderstood not from lack of compassion, but from lack of perspective. We wanted to change that by letting caregivers step inside a patient's world, even for just a few minutes.

What it does

EmpathyLens XR is an immersive WebXR experience that simulates neurological conditions from the patient's point of view. Users are placed inside a 3D hospital room and experience condition-specific visual and audio effects — memory flash overlays and disorientation for dementia, continuous screen tremors for Parkinson's, and a rising BPM counter with pulsing heartbeat for anxiety disorder. Each simulation also features an AI-powered patient (Margaret, Robert, or Sarah) who responds in character via Groq's LLaMA 3.1, so users can hold real conversations with them.

How we built it

We built EmpathyLens XR entirely with web technologies — no native app, no headset required. The 3D environment runs on A-Frame 1.4 (WebXR), with all condition effects written in vanilla JavaScript and CSS animations. Heartbeat audio is generated live via the Web Audio API. AI patient conversations are powered by the Groq API (LLaMA 3.1), with each patient given a distinct personality, speech pattern, and backstory through system prompting. The whole project is hosted for free on GitHub Pages.
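To give a sense of how the AI patient layer fits together, here is a minimal sketch of an in-character chat turn against Groq's OpenAI-compatible chat endpoint. The prompt wording, model id, and helper names are illustrative assumptions, not the project's exact code.

```javascript
// Build the message list: character system prompt, prior turns, new user turn.
function buildMessages(systemPrompt, history, userText) {
  return [
    { role: 'system', content: systemPrompt },
    ...history,
    { role: 'user', content: userText },
  ];
}

// Example character prompt in the spirit of "Margaret" (wording is a guess).
const MARGARET_PROMPT =
  'You are Margaret, an elderly hospital patient living with dementia. ' +
  'You repeat yourself, lose your train of thought, and often ask where ' +
  'your daughter is. Stay gently in character; never sound clinical.';

// Send one chat turn to Groq and return the patient's in-character reply.
async function askPatient(apiKey, messages) {
  const res = await fetch('https://api.groq.com/openai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: 'llama-3.1-8b-instant', // one of Groq's LLaMA 3.1 model ids
      messages,
      temperature: 0.8,
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Keeping the full `history` in every request is what lets each patient stay consistent across a conversation rather than resetting character every turn.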

Challenges we ran into

  • Simulating neurological effects convincingly without being exploitative or inaccurate required careful research and design balance.
  • Keeping the AI patient responses consistently in-character while remaining medically empathetic was tricky to prompt-engineer correctly.
  • Making tremor and heartbeat effects feel immersive on both desktop and mobile without impacting performance took several iterations.
  • Integrating Web Audio API for real-time heartbeat generation with dynamic BPM synced to visual animations required precise timing logic.
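The heartbeat timing challenge above can be sketched roughly as follows, assuming a browser `AudioContext`; the helper names and tuning values are ours, not the project's. The idea is to schedule low "thump" tones a short distance ahead of the audio clock while re-reading the current BPM each beat, so the audio tracks the same value driving the visual pulse.

```javascript
// Seconds between beats for a given BPM.
function beatInterval(bpm) {
  return 60 / bpm;
}

// Synthesize one low "thump" at an absolute AudioContext time.
function scheduleThump(ctx, when) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.type = 'sine';
  osc.frequency.value = 55;                                   // low, chest-like pitch
  gain.gain.setValueAtTime(0.8, when);
  gain.gain.exponentialRampToValueAtTime(0.001, when + 0.15); // quick decay
  osc.connect(gain).connect(ctx.destination);
  osc.start(when);
  osc.stop(when + 0.2);
}

// Keep ~0.5 s of beats scheduled ahead, re-reading BPM every beat so the
// audio stays in sync with a visual pulse driven by the same getBpm().
function startHeartbeat(ctx, getBpm) {
  let next = ctx.currentTime + 0.1;
  const tick = () => {
    while (next < ctx.currentTime + 0.5) {
      scheduleThump(ctx, next);
      next += beatInterval(getBpm());
    }
  };
  return setInterval(tick, 100); // clearInterval() the handle to stop
}
```

Scheduling ahead on the audio clock (rather than firing sounds directly from `setInterval`) is what keeps the beat steady even when the main thread stutters on mobile.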

Accomplishments that we're proud of

  • Built a fully functional, immersive XR experience in one day with zero budget.
  • Created three distinct, research-informed simulations that genuinely evoke empathy.
  • Each AI patient has a unique voice — Margaret repeats herself and asks for her daughter, Robert speaks slowly, Sarah rushes with worry — making conversations feel real.
  • The experience works on any modern browser, on any device, with no installation needed.

What we learned

  • Empathy-driven design requires deep respect for the communities being represented — every effect had to serve understanding, not spectacle.
  • WebXR and A-Frame are surprisingly powerful for rapid prototyping of immersive healthcare tools.
  • System prompting for emotionally nuanced AI characters is an art — tone, pacing, and vocabulary matter as much as factual accuracy.
  • Accessibility and zero-barrier entry (no headset, no install) dramatically increase the potential impact of XR health tools.

What's next for EmpathyLens XR

  • Add more conditions: chronic pain, ADHD, depression, and stroke recovery simulations.
  • Introduce voice-based conversations with the AI patients using speech-to-text.
  • Build a structured training module with pre/post empathy assessments for medical institutions.
  • Partner with nursing schools and care homes to pilot the experience in real training programs.
  • Explore native VR headset support for even deeper immersion.

Built With

  • a-frame-1.4
  • css-animations
  • github
  • groq
  • llama-3.1
  • vanilla-javascript
  • web-audio-api
  • webxr