TL;DR
We built a real-time neurological screening system for ER triage. Using facial symmetry, speech analysis, and automated pupillary reflex testing, it rapidly flags potential strokes and traumatic brain injuries that are often missed in early triage. Designed for speed and sensitivity, the system integrates directly into ER workflows to reduce fatal false negatives.
Inspiration
We were inspired by The Pitt and how effortless ER triage looks on screen. In reality, it's chaotic, compressed, and unforgiving. Emergency rooms are designed to act fast on what is visible: bleeding, unconsciousness, obvious trauma.
What’s far more dangerous are patients who look stable but aren’t.
FAST-negative strokes and traumatic brain injuries often present with subtle neurological changes before collapsing into catastrophic outcomes. A single missed signal or delayed escalation can be fatal. We built this project to surface those signals early and fast enough to matter.
What It Does
Our system provides rapid, automated neurological screening designed to integrate directly into ER workflows.
It continuously monitors patients for:
Facial symmetry deviation → early stroke detection
Speech pattern changes → slurring or cognitive impairment
Pupillary reflex absence → traumatic brain injury or internal bleeding
All analysis runs in near real time. When risk thresholds are crossed, the system immediately alerts hospital staff. It favors sensitivity to avoid false negatives, even if it means tolerating some false positives.
Speed is central: this system is built to flag risk within seconds, not minutes, so it can be used during triage, not after.
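The sensitivity-biased alerting described above can be sketched as a simple OR over per-modality thresholds. The signal names, threshold values, and scoring scales below are illustrative stand-ins, not our calibrated production values:

```python
from dataclasses import dataclass

# Illustrative thresholds, deliberately set low (sensitivity-biased):
# we would rather raise a false alarm than miss a deteriorating patient.
THRESHOLDS = {
    "facial_asymmetry": 0.15,   # normalized left/right deviation score
    "speech_slurring": 0.30,    # probability of slurred speech
    "pupil_nonreactive": 0.50,  # probability the pupillary reflex is absent
}

@dataclass
class RiskAssessment:
    flagged: bool
    triggers: list

def assess(signals: dict) -> RiskAssessment:
    """Flag a patient if ANY modality crosses its threshold.

    OR-ing the modalities, instead of requiring agreement between them,
    is what biases the system toward sensitivity over specificity.
    """
    triggers = [name for name, limit in THRESHOLDS.items()
                if signals.get(name, 0.0) >= limit]
    return RiskAssessment(flagged=bool(triggers), triggers=triggers)

# Example: slurred speech alone is enough to escalate to staff.
result = assess({"facial_asymmetry": 0.05, "speech_slurring": 0.42})
```

In a deployed version, `assess` would run on every analysis tick and a flagged result would trigger the staff alert path immediately.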
How We Built It
We implemented a fast, multimodal sensing pipeline optimized for low latency:
Stroke Detection
Overshoot AI for facial symmetry deviation detection
ElevenLabs for speech pattern deviation and slurring analysis
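As a rough illustration of the facial-symmetry signal: assuming the vision model returns 2D facial landmarks, asymmetry can be scored by reflecting each left-side landmark across the facial midline and measuring its residual distance from its right-side partner. The landmark pairs and coordinates below are hypothetical, not Overshoot AI's actual output format:

```python
import math

def asymmetry_score(pairs, midline_x=0.5):
    """Mean mirror-distance between left/right landmark pairs.

    Each left point is reflected across the vertical facial midline; the
    residual distance to its right-side partner measures droop/asymmetry.
    Coordinates are normalized image coordinates in [0, 1].
    """
    total = 0.0
    for (lx, ly), (rx, ry) in pairs:
        mirrored_lx = 2 * midline_x - lx  # reflect left point across midline
        total += math.hypot(mirrored_lx - rx, ly - ry)
    return total / len(pairs)

# Symmetric face: mouth corners equidistant from midline, same height.
symmetric = [((0.40, 0.70), (0.60, 0.70))]
# Facial droop: the right mouth corner sits noticeably lower.
drooped = [((0.40, 0.70), (0.60, 0.78))]
```

A score near zero indicates symmetry; a sustained elevated score contributes to the stroke-risk flag.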
Traumatic Brain Injury Detection
Overshoot AI for real-time pupil localization
A motorized light source driven by a DC motor and mechanical actuator
A control-systems algorithm steers the light toward the pupil, then evaluates the constriction response for the pupillary reflex test
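The light-positioning loop can be approximated with proportional control: step the light toward the pupil in proportion to the tracking error, then compare pupil diameter before and after illumination. The gain, tolerance, and hardware interfaces below are stand-ins for our actual motor and vision stack, not the exact implementation:

```python
def run_reflex_test(get_pupil_x, get_light_x, move_light, read_pupil_diameter,
                    gain=0.5, tolerance=2.0, max_steps=50):
    """Steer the light toward the pupil, then check for constriction.

    get_pupil_x / get_light_x return horizontal positions in pixels;
    move_light(delta) nudges the motorized light; read_pupil_diameter()
    samples the pupil size (e.g. in mm) before and after illumination.
    Returns the fractional constriction; a value near zero (no response)
    is the red flag the system escalates on.
    """
    baseline = read_pupil_diameter()
    # Proportional control: step size shrinks as the tracking error shrinks.
    for _ in range(max_steps):
        error = get_pupil_x() - get_light_x()
        if abs(error) <= tolerance:
            break
        move_light(gain * error)
    constricted = read_pupil_diameter()
    return (baseline - constricted) / baseline

# Simulated hardware for demonstration:
state = {"light_x": 0.0}

def read_diam():
    # Simulated pupil: constricts from 6 mm to 3 mm once the light is on target.
    return 3.0 if abs(state["light_x"] - 100.0) <= 2.0 else 6.0

ratio = run_reflex_test(
    get_pupil_x=lambda: 100.0,
    get_light_x=lambda: state["light_x"],
    move_light=lambda d: state.update(light_x=state["light_x"] + d),
    read_pupil_diameter=read_diam,
)
```

With the simulated pupil above, the loop converges on the target and reports a healthy ~50% constriction; a non-reactive pupil would return roughly zero.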
By combining vision, audio, and reflex testing, the system builds a real-time neurological risk profile without interrupting ER flow.
Challenges
Integrating multiple AI APIs under hackathon time constraints was challenging, particularly around consistent tracking outputs and synchronization across modalities. On the hardware side, 3D printing issues prevented full deployment of the rack-and-pinion mechanism during the event.
Despite this, we validated the system architecture, control logic, and real-time feasibility.
Vision
Our goal is a standalone, bedside-compatible robotic system that:
Collects audio-visual input autonomously
Processes signals using Overshoot AI and ElevenLabs
Performs automated pupillary reflex testing
Flags neurological risk instantly to ER staff
The key bottleneck ahead is providing enough computational power at the bedside while keeping latency low. Solving this enables something critical: neurological triage that moves at ER speed.
Why This Matters
In emergency medicine:
False positives slow things down
False negatives cost lives
Our system is intentionally biased toward sensitivity. It does not diagnose—it escalates. By detecting subtle neurological deterioration early and fast, it gives clinicians time they otherwise wouldn’t have.
Built With
- css
- datasets
- elevenlabs
- html
- javascript
- llm
- mern
- next.js
- overshoot
- python
- tensor-flow
- typescript
- vllm