Inspiration

The biggest problem in emergency care is missing information: patient histories arrive too late, and information about the present gets lost. Roughly one in three ER patients is at risk of a treatment mistake, and around 40% of those mistakes are critical. And this starts in the ambulance. Paramedics see and do everything; by the time the patient reaches the ER, much of that context is lost. Studies put missing information in ambulance-to-ER handovers at around 30%. That means wrong or delayed treatments, duplicated work, and risk that could be avoided.

There needs to be a better way to find and request information autonomously, accurately document it, and stream it straight to the hospital so medics can keep their hands and attention on the patient. That's why we built RELAY.

What it does

Relay is an AI-powered emergency response system that turns paramedic voice and background data into a single, real-time handover from ambulance to ER. It addresses the past, present, and future of the patient in one pipeline, so the ER sees the full picture before the patient even arrives.

Devices: a laptop with a microphone in the ambulance, and a browser-based dashboard in the hospital.

Past: patient history (hands-off search)

Today, finding past context (allergies, conditions, medications, prior care) often means nurses or medics calling GPs and chasing records by phone, or typing notes into the official documentation that is already required during the ambulance ride. We outsource that to agents so medics stay with the patient. Once the paramedic's voice has yielded the core identity (full name, address, age, gender), we trigger in parallel (see the sketch after this list):

  1. GP outreach: we resolve the GP/practice (e.g., look up the number by name and location), place a call (ElevenLabs + Twilio), and request the data. No medic on the phone.
  2. Medical history: we query FHIR R4 (Synthea for dev; in production, HIE or Particle Health–style APIs) for the matching patient records, so the team knows before arrival which drugs are safe and which to avoid.
  3. Relatives: we forward the relatives' details to the hospital so they can be reached sooner; relatives are among the most critical sources of information, and we deliberately keep this channel human rather than using an artificial agent, since it requires sensitive handling.

All of that is merged into one patient story. Hands-off means hands-off: documentation, requesting, and consolidation are done by the system and its agents, not by someone on a keyboard or phone.
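A minimal sketch of the parallel trigger, assuming an async Python backend: `gp_outreach`, `fetch_fhir_history`, `FHIR_BASE`, and the field names are illustrative placeholders, not our production integration. The point is that the GP call and the history lookup fire concurrently the moment the core identity is complete.

```python
# Sketch of the parallel enrichment trigger (helper names and FHIR_BASE
# are illustrative placeholders, not the production integration).
import asyncio
import httpx

FHIR_BASE = "http://localhost:8080/fhir"  # e.g. a Synthea-backed dev server
CORE_FIELDS = ("full_name", "address", "age", "gender")

async def gp_outreach(identity: dict) -> dict:
    # Placeholder: resolve the practice's number, then have the voice
    # agent (ElevenLabs + Twilio in our stack) place the request call.
    await asyncio.sleep(0)
    return {"status": "call_placed"}

async def fetch_fhir_history(identity: dict) -> dict:
    # Placeholder: FHIR R4 Patient search; real code would follow up with
    # AllergyIntolerance, Condition, and MedicationStatement queries.
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"{FHIR_BASE}/Patient",
                                params={"name": identity["full_name"]})
        return resp.json()

async def enrich(identity: dict) -> dict | None:
    # Fire only once the paramedic's speech has yielded all core fields.
    if not all(identity.get(f) for f in CORE_FIELDS):
        return None
    gp, history = await asyncio.gather(gp_outreach(identity),
                                       fetch_fhir_history(identity))
    return {"gp": gp, "history": history}
```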

Present: what’s happening now (voice → structure, no scribbling)

Voice is the only input. A wearable mic streams the treating medic's audio. NEMSIS-compliant information is extracted (demographics, vitals, procedures, medications, impressions) so the ePCR fills itself as the paramedic speaks. No writing, no tapping: medics keep their hands on the patient for the full ride.
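To make that concrete, here is a hedged sketch of how one committed transcript chunk could update the ePCR. `call_llm` stands in for our model-agnostic client, and the sections are an illustrative subset of NEMSIS, not the full schema.

```python
# Hedged sketch: one committed transcript chunk -> merged ePCR fields.
# `call_llm` is a stand-in for the model-agnostic LLM client; the
# sections below are an illustrative subset of NEMSIS, not the schema.
import json

EXTRACTION_PROMPT = """Extract NEMSIS-relevant fields from this EMS transcript
chunk. Return JSON with any of: patient_info (object), vitals (list),
medications (list), procedures (list), impressions (list). Omit anything
not explicitly mentioned.

Transcript: {chunk}"""

def update_epcr(epcr: dict, chunk: str, call_llm) -> dict:
    fields = json.loads(call_llm(EXTRACTION_PROMPT.format(chunk=chunk)))
    for section, value in fields.items():
        if isinstance(value, dict):
            epcr.setdefault(section, {}).update(value)   # merge objects
        elif isinstance(value, list):
            epcr.setdefault(section, []).extend(value)   # append list entries
    return epcr
```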

Future: what the hospital needs to be ready (stream + speed)

Right now the hospital is often called 1–2 minutes out, if the medics have the time at all, and gets the full ePCR only at the end. We flip that: we stream information to the hospital en route. The hospital sees a live dashboard over WebSockets (active cases, NEMSIS updates, GP and medical-DB responses) plus urgency-based summaries for the handover, so the ER can prepare earlier: right team, right room, right drugs. Speed comes from (1) parallel GP + DB fetches, (2) a continuous stream instead of one report at the end, and (3) less manual documentation, so care isn't delayed.
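A minimal sketch of the streaming side, assuming the FastAPI + WebSockets stack described below; the endpoint path and message shape are illustrative:

```python
# Sketch of the hospital feed: every connected dashboard receives each
# update as it happens. Endpoint path and payload shape are illustrative.
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
clients: set[WebSocket] = set()

@app.websocket("/ws/hospital")
async def hospital_feed(ws: WebSocket):
    await ws.accept()
    clients.add(ws)
    try:
        while True:
            await ws.receive_text()  # keep the connection alive; input ignored
    except WebSocketDisconnect:
        clients.discard(ws)

async def broadcast(update: dict) -> None:
    # Called whenever the ePCR, a GP response, or a DB result changes.
    for ws in set(clients):
        try:
            await ws.send_json(update)
        except Exception:
            clients.discard(ws)  # drop dead connections silently
```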

How it fits together

  • Handover gap: by recording and structuring everything from voice and pulling history in parallel, we aim for a complete handover and a lower risk of missing information.
  • Delay in care: because there's only so much one human can do on a ride, we outsource documentation and information-gathering to the system and agents; more information is available faster and more accurately.
  • Delay in hospital prep: a continuous stream of information lets the hospital prepare sooner.
  • Human labor and error: we replace calling around, handwritten notes, and late data entry with secure electronic flows and agents that call, orchestrate, search, and consolidate.

How we built it

We built an end‑to‑end ambulance‑to‑hospital pipeline that captures voice hands‑free, extracts structured clinical data, and pushes it to a live hospital prep dashboard.

Core stack

  • Frontend: HTML/CSS/JS dashboards for paramedic + hospital views, real‑time updates via WebSockets.
  • Backend: FastAPI + WebSockets, event bus for fan‑out, SQLite for persistence.
  • AI: LLM‑based NEMSIS ePCR extraction and hospital summaries (Anthropic/OpenAI/Modal‑ready via a model‑agnostic client, Perplexity for targeted search).
  • Voice: ElevenLabs streaming transcription.
  • Data: NEMSIS‑compliant JSON, transcript segments with timestamps, and vitals playback for realism.
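The "event bus for fan‑out" above is a small in‑process pub/sub. A minimal sketch of the idea (our real implementation may differ; topic names are illustrative):

```python
# Minimal in-process pub/sub for fan-out (illustrative sketch; the real
# bus may differ). One publish reaches every subscriber's queue.
import asyncio
from collections import defaultdict

class EventBus:
    def __init__(self) -> None:
        self._subs: dict[str, list[asyncio.Queue]] = defaultdict(list)

    def subscribe(self, topic: str) -> asyncio.Queue:
        # Each subscriber (WebSocket push, SQLite writer, extractor) gets
        # its own queue, so a slow consumer doesn't block the others.
        queue: asyncio.Queue = asyncio.Queue()
        self._subs[topic].append(queue)
        return queue

    async def publish(self, topic: str, event: dict) -> None:
        for queue in self._subs[topic]:
            await queue.put(event)
```

A committed transcript segment, for example, is published once and consumed independently by the NEMSIS extractor, the persistence layer, and the hospital push.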

Pipeline

  1. Wearable mic streams audio to the backend, producing continuous, timestamped segments.
  2. ElevenLabs transcribes in real time, merging partials into committed transcript updates.
  3. Each committed chunk updates a NEMSIS‑compliant ePCR record (patient info, vitals, impressions, procedures, meds).
  4. Structured data is persisted and pushed over WebSockets so the hospital view stays 1:1 with the ambulance stream.
  5. Once core identity is complete (name, age/DOB, address, gender), the system triggers parallel enrichment: medical history lookup and GP outreach.
  6. The hospital dashboard renders an inbound critical brief, contraindication warnings, prep actions, live vitals trends, and a full NEMSIS ePCR tab for deeper drill‑down.
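Step 2 is the subtlest part, so here is a hedged sketch of the merge logic, independent of the actual ElevenLabs client API: partials overwrite a rolling buffer, and only segments marked final are committed with a timestamp and forwarded to the extractor.

```python
# Illustrative sketch of partial-vs-committed transcript handling; the
# real ElevenLabs streaming API and our handler differ in detail.
import time
from dataclasses import dataclass, field

@dataclass
class TranscriptState:
    committed: list[dict] = field(default_factory=list)
    partial: str = ""  # rolling hypothesis, overwritten on every update

    def on_update(self, text: str, is_final: bool) -> dict | None:
        if not is_final:
            self.partial = text
            return None
        segment = {"timestamp": time.time(), "text": text}
        self.committed.append(segment)  # only finals feed the ePCR extractor
        self.partial = ""
        return segment
```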

Challenges we ran into

  • Real‑time coordination: Keeping transcript, NEMSIS extraction, and hospital UI in sync with low latency.
  • Data completeness vs. speed: Balancing early summaries with partial information without misleading clinicians.
  • UI clarity under pressure: Designing a dashboard that is dense but still readable in seconds.
  • Integration complexity: Orchestrating voice, LLM extraction, and streaming updates reliably across the pipeline.

Accomplishments that we're proud of

  1. A hands‑free, end‑to‑end pipeline from ambulance voice to a live hospital dashboard.
  2. NEMSIS‑compliant structured extraction updated in real time.
  3. A hospital UI that highlights critical prep actions and contraindications immediately.
  4. A system that streams the ePCR 1:1 to the ER before arrival.

What we learned

Software engineering is dead.

What's next for Relay – Voice, Search, Stream: Save patients from rigs to ER

  1. Production‑grade integrations with HIEs and hospital EHR systems.
  2. Clinical decision support tuned to EMS protocols and ER workflows.
  3. Voice‑first clinician queries for fast retrieval of critical facts.
  4. Hospital prep automation (stroke, STEMI, trauma activations) with tighter rules and audit trails.
  5. Security + compliance hardening for real clinical deployment.
  6. A custom, 3D-printed wearable with an integrated microphone.
  7. Adding AI-based predictive and diagnostic tools for early warning systems.
