Inspiration

We noticed that EMTs and paramedics have to document electronic Patient Care Reports (ePCRs) during the ambulance ride to the hospital, in critical minutes when they could be caring for the patient instead. Taking inspiration from Anduril's Eagle Eye (XR augmentation for the military), we realized we could use XR to assist EMTs and help save lives.

What it does

The XR app records video of the call. Our backend processes that video into a voice-isolated, diarized transcript (ElevenLabs) and feeds it, along with the footage, to a VLM (Gemini) to draft a comprehensive ePCR. A Browser Use agent then fills the report into the hospital's own website, so it can integrate with any hospital system. We also included a voice-activated facial recognition feature (simulated as an opt-in MyChart integration) that pulls critical patient info such as medications, allergies, and emergency contacts when a patient is unable to relay it themselves. Our voice agent "Eva" can be commanded to control the XR app.

How we built it

XR App:

  • Kotlin + Android Studio SDKs
  • Samsung Galaxy XR headset (a glasses form factor is expected in 2026)
  • Pinch gestures to record and interact; an "Eva" voice agent built on Google ASR plus keyword matching: "Eva, who is this?", "Eva, log X event."
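The app itself is Kotlin, but the wake-word routing idea is simple keyword matching on the ASR transcript. Here is a minimal Python sketch of that logic; the command table and action names are hypothetical illustrations, not our actual implementation:

```python
import re

# Hypothetical command table: each regex maps an utterance to an app action.
COMMANDS = [
    (re.compile(r"who is this", re.I), "IDENTIFY_PATIENT"),
    (re.compile(r"log (?P<event>.+)", re.I), "LOG_EVENT"),
    (re.compile(r"start recording", re.I), "START_RECORDING"),
]

WAKE_WORD = "eva"

def route_utterance(transcript: str):
    """Return (action, args) if the transcript starts with the wake word,
    otherwise None (the utterance was not addressed to the agent)."""
    text = transcript.strip()
    if not text.lower().startswith(WAKE_WORD):
        return None
    # Drop the wake word and any trailing punctuation/whitespace after it.
    rest = re.sub(r"^eva[,.!?\s]*", "", text, flags=re.I)
    for pattern, action in COMMANDS:
        match = pattern.search(rest)
        if match:
            return action, match.groupdict()
    return "UNKNOWN", {}
```

The same pattern (check wake word, strip it, match keywords) is what we wired onto the Google ASR results in Kotlin.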

Backend:

  • FastAPI (Python) app
  • Video Processing Pipeline: ElevenLabs voice isolation minimizes EMT field noise before producing a diarized transcript. Gemini 3.1 Flash then takes the diarized transcript, the field video, and patient information (from facial recognition) to make informed visual inferences about the scene and write the report.
  • Facial Recognition: a ResNet-based facial recognition model produces face embeddings, which we store as vectors in a cloud PostgreSQL database (Supabase). On voice command, the XR app takes a screenshot and uploads it to the backend for embedding and vector similarity search.
  • Browser Use: we used the open-source browser-use library to fill out hospital forms. After the form is filled out and reviewed/edited by the EMT, the report is sent to the hospital's email (or whatever intake system the hospital already uses).

Mock Hospital Forms:

  • These mock existing hospital ePCR forms. We built two to show that Browser Use can fill out any hospital form, making integration easy.
  • Since we use Browser Use, the EMT can edit any field of the ePCR before it gets sent!
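One simple way to hand the reviewed report to a browser agent is to flatten it into a natural-language task string. The sketch below is an illustration only; the URL, field names, and the actual Browser Use invocation are assumptions specific to each integration:

```python
def epcr_to_task(form_url: str, report: dict[str, str]) -> str:
    """Render the reviewed ePCR as a task description a browser agent
    can follow to fill out an arbitrary hospital form."""
    steps = [f"Open {form_url} and fill out the ePCR form:"]
    for field, value in report.items():
        steps.append(f"- Set '{field}' to: {value}")
    steps.append("Submit the form and email the confirmation to the hospital.")
    return "\n".join(steps)
```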

Challenges we ran into

  • Mostly in building the XR app. We had minimal Kotlin experience, and the Samsung Galaxy XR headset has minimal community support for development. In particular, building our Eva voice agent on Google ASR with the custom "Eva" wake word wasn't covered in any example we could find online.

Accomplishments that we're proud of

  • The Eva voice agent
  • Facial recognition, which we think could genuinely help a lot of EMTs and patients.

What we learned

  • How to write Kotlin for the Samsung Galaxy XR Headset

What's next for Eva (EMT Virtual Assistant)

  • Polish the XR app UI/UX.
  • Integrating with actual hospital systems + MyChart
  • Getting some user testing!
