Inspiration
The number that started everything: 24% of combat deaths are potentially survivable — not with better weapons or more personnel, but with faster, better-organized triage.
We kept coming back to that statistic, because it doesn't describe a medicine problem. It describes a problem of decision-making under pressure. The lead medic arrives at a mass-casualty scene and, within seconds, has to hold every victim's condition in their head simultaneously — while their hands are already on the worst one. Past three victims, human working memory begins to collapse. Paper MIST cards (Mechanism, Injuries, Signs, Treatment) get lost. Radio handoffs get garbled. Downstream medics make decisions based on information that's already out of date.
Nobody built a tool that fixes that. So we did.
What it does
PULSE is a perception relay for the lead medic. It watches the scene through a camera, listens through a microphone, and builds a live, structured picture of every casualty — automatically.
For each victim, PULSE identifies wounds, estimates vitals remotely from the face, and synthesizes a clinical handoff card in plain language. The moment the medic confirms a triage tag with a single tap, that information broadcasts instantly to every downstream medic, commander, and medevac crew on the network — appearing as a pinned marker in the tactical map they're already using.
The medic never stops touching the patient. PULSE handles the documentation, the relay, and the situational picture.
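Markers like these are typically exchanged as Cursor-on-Target (CoT) events, the XML format that ATAK consumes. Below is a minimal, hypothetical sketch of such a broadcast payload; the type code, stale window, and field layout are illustrative assumptions, not PULSE's actual output.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

def casualty_marker(uid: str, lat: float, lon: float, mist: str) -> bytes:
    """Build a minimal Cursor-on-Target event for a casualty marker.
    The type code and 10-minute stale window are illustrative choices."""
    now = datetime.now(timezone.utc)
    iso = lambda t: t.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    ev = ET.Element("event", version="2.0", uid=uid, type="b-r-f-h-c",
                    how="m-g", time=iso(now), start=iso(now),
                    stale=iso(now + timedelta(minutes=10)))
    ET.SubElement(ev, "point", lat=str(lat), lon=str(lon),
                  hae="0.0", ce="10.0", le="10.0")
    detail = ET.SubElement(ev, "detail")
    ET.SubElement(detail, "remarks").text = mist  # the synthesized MIST card
    return ET.tostring(ev)
```

One event per confirmed casualty, rebroadcast whenever the medic updates the tag, is enough for the marker to stay pinned and current on every downstream map.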
Three things make it different from anything that exists today:
- It works completely offline. No cell signal, no cloud, no external dependency. Everything runs on a laptop in a pack.
- The AI suggests. The medic decides. Every time. Triage tags are never assigned autonomously. The system is a decision-support tool, not a decision-making one — and that line is a hard gate in the code.
- It fits in a kit bag today. Not a drone, not a robot, not a $400,000 research platform. Commodity hardware, wearable form factor, field-deployable this year.
How we built it
We built PULSE as a tight perception pipeline running end-to-end on a single edge device — no internet, no external APIs, no cloud inference.
The system layers computer vision for person detection and wound segmentation, remote photoplethysmography for contactless heart rate and respiratory rate, speech transcription for voice notes, and a quantized language model for MIST card synthesis — all running simultaneously and all on-device. The output is a structured casualty state that feeds a live dashboard and broadcasts as standard tactical map markers over the protocol already used by military and first responder teams worldwide.
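As a purely illustrative sketch, a structured casualty state of this kind might look as follows; the field names and schema are assumptions, not the project's actual data model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CasualtyState:
    """One casualty as tracked end to end (illustrative schema, not the real one)."""
    casualty_id: str                       # stable track ID from the detector
    wounds: List[str] = field(default_factory=list)       # e.g. "hemorrhage, left thigh"
    heart_rate_bpm: Optional[float] = None                # from rPPG; None until the signal stabilizes
    resp_rate_bpm: Optional[float] = None
    voice_notes: List[str] = field(default_factory=list)  # transcribed medic speech
    mist_card: Optional[str] = None        # LLM-synthesized handoff text
    triage_tag: Optional[str] = None       # set only on medic confirmation, never by the AI
    confidence: float = 0.0                # aggregate perception confidence

# Each perception model writes into the same record as evidence arrives.
state = CasualtyState(casualty_id="cas-01")
state.wounds.append("hemorrhage, left thigh")
state.heart_rate_bpm = 118.0
```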
We designed the clinical logic to align with the two standards that actually govern triage in the field — TCCC for combat and SALT for civilian mass casualty — and built five hot-swappable scenario modes covering blast, fire, MVA, CBRN, and civilian MCI. Same pipeline, different clinical lens.
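To illustrate what a hot-swappable clinical lens could amount to in code, here is a hypothetical mode table; the standards mapping and priority findings are placeholders, not the project's actual clinical logic.

```python
# Hypothetical mode table: same pipeline, different clinical lens.
# Mode names come from the write-up; everything else is a placeholder.
SCENARIO_MODES = {
    "blast":        {"standard": "TCCC", "priorities": ["hemorrhage", "airway", "tension pneumothorax"]},
    "fire":         {"standard": "TCCC", "priorities": ["airway burns", "inhalation injury"]},
    "mva":          {"standard": "SALT", "priorities": ["hemorrhage", "spinal precautions"]},
    "cbrn":         {"standard": "TCCC", "priorities": ["respiratory distress", "decontamination status"]},
    "civilian_mci": {"standard": "SALT", "priorities": ["hemorrhage", "airway", "walking wounded sort"]},
}

def set_mode(name: str) -> dict:
    """Hot-swap the clinical lens without touching the perception stack."""
    if name not in SCENARIO_MODES:
        raise ValueError(f"unknown scenario mode: {name}")
    return SCENARIO_MODES[name]
```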
The entire system was scoped, architected, and built during this hackathon.
Challenges we ran into
Getting everything to run simultaneously on a single consumer GPU without dropping frames was the central engineering challenge. Each model competes for the same GPU memory and compute budget — and in a triage scenario, latency isn't an inconvenience, it's a clinical risk. We profiled and tuned every stage until the full pipeline held steady at 6–12 FPS with under 6 GB of VRAM in use.
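That kind of per-stage tuning starts with knowing where each millisecond goes. A small profiling harness, sketched under our own assumptions (this is illustrative tooling, not the project's actual profiler):

```python
import time
from collections import defaultdict

class StageProfiler:
    """Accumulates per-stage wall-clock time so the slowest model is obvious."""
    def __init__(self):
        self.totals = defaultdict(float)
        self.frames = 0

    def run_frame(self, stages):
        # stages: list of (name, callable) run in order for one frame
        for name, fn in stages:
            t0 = time.perf_counter()
            fn()
            self.totals[name] += time.perf_counter() - t0
        self.frames += 1

    def report(self):
        # mean seconds per frame for each stage
        if not self.frames:
            return {}
        return {name: total / self.frames for name, total in self.totals.items()}
```

Running every stage through a harness like this makes the trade-offs explicit: a stage whose mean cost blows the frame budget either gets quantized, downscaled, or moved to a slower cadence than the camera loop.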
The harder challenge was clinical. We had to make genuine decisions about where the AI's authority ends. How do you design a system that's fast enough to be useful without crossing the line into autonomous decision-making? We ended up hard-coding the constraints rather than relying on policy — certain triage categories cannot be emitted by the system, full stop. Designing those gates forced us to think carefully about what "decision support" actually means when the stakes are this high.
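A hard gate of this kind can be as blunt as a whitelist check at the single choke point where tags enter the system. In the sketch below, the split between AI-suggestible and human-only categories is an assumption for illustration:

```python
# Triage categories follow common MASCAL color coding; which ones the AI
# may even *suggest* is an assumption here. Assignment always requires
# medic confirmation.
AI_SUGGESTIBLE = {"minimal", "delayed", "immediate"}
HUMAN_ONLY = {"expectant", "deceased"}

def emit_tag(tag: str, source: str) -> str:
    """Single choke point for triage tags. AI output is only ever a
    suggestion, and some categories cannot come from the AI at all."""
    if tag not in AI_SUGGESTIBLE | HUMAN_ONLY:
        raise ValueError(f"unknown triage category: {tag}")
    if source == "ai" and tag not in AI_SUGGESTIBLE:
        raise PermissionError(f"'{tag}' cannot be emitted by the system")
    return tag
```

Because the constraint lives in code rather than in a prompt or a policy document, no upstream model output can route around it.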
Accomplishments that we're proud of
We built something that works — not as a demo with mocked data, not as a slide deck with a concept architecture, but as a real pipeline that runs on real hardware and produces real clinical output.
We're proud that the system is honest about what it is. It tells the medic when confidence is low. It attributes every state change to either the AI or the human in the audit trail. It never pretends to know more than it does.
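That attribution can be as simple as an append-only log in which every entry names its actor. A hypothetical sketch mirroring the AI-versus-human split described above:

```python
import time

def audit(log: list, field: str, value, actor: str) -> None:
    """Record a state change in an append-only audit trail.
    Every entry names its actor: 'ai' or 'medic' (illustrative schema)."""
    if actor not in ("ai", "medic"):
        raise ValueError(f"unknown actor: {actor}")
    log.append({"ts": time.time(), "field": field, "value": value, "actor": actor})

trail = []
audit(trail, "heart_rate_bpm", 118.0, "ai")       # perception update
audit(trail, "triage_tag", "immediate", "medic")  # human decision
```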
And we're proud that we stayed disciplined about scope. It would have been easy to build something impressive-looking that cuts corners on the clinical logic or the ethics. We didn't. Every design decision came back to one question: would a medic actually trust this with a patient in front of them?
We think the answer is yes. And we think that's worth more than any benchmark.
What we learned
We learned that the hardest part of building AI for high-stakes environments isn't the model — it's the interface between the model's output and the human's decision. Getting that interface right requires understanding the clinical workflow deeply enough to know exactly what information to surface, when, and with how much confidence signaling.
We also learned that "offline-first" is an underrated design constraint that makes everything harder and everything better. When you can't rely on a cloud fallback, every component has to earn its place. It forced us to be ruthless about what the system actually needs to do versus what would be nice to have.
What's next for PULSE: Perception Unified Lifesaving Segmentation Engine
The immediate path is a wearable form factor — smart glasses as the capture layer, a compact compute pack worn on the body, and mesh radio so the relay works even when WiFi doesn't. The architecture already supports it. The models are already small enough. It's an integration problem, not a research problem.
Beyond that, the roadmap runs through validated field exercises with operational medics, iterative refinement against real MASCAL scenarios, and ultimately a regulatory submission as a Class II decision-support device.
The civilian market is as large as the military one. Every city with a stadium, a subway system, or a major highway has a mass casualty plan with no tool like this in it. The same system — same hardware, same pipeline, same offline-first architecture — serves all of them.
PULSE is a hackathon prototype today. With the right investment and the right partners, it becomes the standard of care for triage relay within three years. We're not building toward that future. We're already in it.
Built With
- atak
- css3
- cuda
- faster-whisper
- grounding-dino
- html5
- huggingface-transformers
- javascript
- llama-3.2
- llama.cpp
- mediapipe
- opencv
- python
- pytorch
- sam-2.1
- websockets
- yolov8
