FamilyBridge: Emergency Triage for the Families We Love
## Inspiration

The heart of this project lies in the High Atlas mountains and the quiet nights of Morocco. Both of my grandmothers speak only Amazigh. I now live far away from them in an English-speaking environment, and that distance creates a terrifying question: what happens if there is a medical emergency at 2 AM and no one can bridge the language gap?

In many Moroccan households, the primary caregivers may not speak English or French fluently, yet medical systems often demand it. I was inspired to create FamilyBridge to ensure that language is never the reason a child's fever or an elderly parent's stroke goes untreated. Whether it's a life-threatening emergency or simply checking a "bump on a finger" from thousands of miles away, this is for them.
## What it does

FamilyBridge acts as an intelligent, multimodal "bridge" between a panicked caregiver and life-saving medical guidance.

- **Multimodal Analysis:** The user points their camera at the patient while describing symptoms in their native tongue (Amazigh, Arabic, or English).
- **Dual-Track Triage:** The AI applies the Pediatric Assessment Triangle for children and the FAST protocol for elderly stroke detection. It can also assess localized issues, such as skin infections or minor injuries.
- **Bilingual Output:** It provides immediate, calming instructions in the caregiver's language ("Go to the ER now" or "Monitor at home") while simultaneously generating a professional English medical summary for the doctor.
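The dual-track flow above could be modeled with a small result type carried from the AI layer to the UI. This is a minimal TypeScript sketch; the names (`TriageLevel`, `TriageResult`, `defaultInstruction`) and the instruction strings beyond "Go to the ER now" / "Monitor at home" are illustrative, not the app's actual schema.

```typescript
// Illustrative sketch: the triage outcome passed from the AI layer to the UI.
// "red" = go to the ER now, "yellow" = seek care soon, "green" = monitor at home.
type TriageLevel = "red" | "yellow" | "green";

interface TriageResult {
  level: TriageLevel;
  caregiverLanguage: "amazigh" | "arabic" | "english";
  caregiverInstruction: string; // calming instruction in the caregiver's language
  clinicalSummary: string;      // professional English summary for the doctor
}

// Hypothetical helper: a fallback caregiver-facing line for each level
// (the "yellow" and "green" wordings here are placeholders).
function defaultInstruction(level: TriageLevel): string {
  switch (level) {
    case "red":    return "Go to the ER now.";
    case "yellow": return "Call a doctor within the hour.";
    case "green":  return "Monitor at home and re-check in 30 minutes.";
  }
}
```

Keeping the caregiver instruction and the clinical summary as two separate fields is what lets the same triage result drive both audiences at once.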
## How we built it

We built FamilyBridge to be fast, reactive, and intelligent, because every second counts in an emergency.

- **The Brain:** Gemini 1.5 Pro via the Google Generative AI SDK. Its multimodal capabilities allow it to "see" clinical signs like chest retractions or skin cyanosis while "hearing" the nuances of Amazigh dialects.
- **The Skeleton:** We ensure the app gracefully handles the "thinking" time of the AI without freezing.
- **The Logic:** We implemented a Triage Urgency Score $U$ to categorize risks:

$$U = \omega_{age} \cdot \sum (V_{visual} + A_{auditory})$$

where:

- $V_{visual}$ represents detected clinical signs (like intercostal retractions or facial drooping),
- $A_{auditory}$ represents the severity of reported pain or duration described by the user, and
- $\omega_{age}$ is the weight multiplier adjusted for high-risk age groups (infants and the elderly).
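The score above reads directly as code. Here is a minimal TypeScript sketch; the age-group weights and severity values are illustrative placeholders, not the app's clinically tuned numbers.

```typescript
// Sketch of the Triage Urgency Score: U = ω_age · Σ (V_visual + A_auditory).
// All numeric values below are illustrative placeholders, not clinical constants.
interface SignScore {
  visual: number;   // V: severity of a detected clinical sign (e.g. retractions)
  auditory: number; // A: severity of reported pain/duration from the description
}

// ω_age: higher multipliers for the high-risk age groups (infants, the elderly).
const AGE_WEIGHT: Record<"infant" | "child" | "adult" | "elderly", number> = {
  infant: 1.5,
  child: 1.2,
  adult: 1.0,
  elderly: 1.4,
};

function urgencyScore(
  ageGroup: keyof typeof AGE_WEIGHT,
  signs: SignScore[],
): number {
  // Σ (V + A) over all observed signs, then scaled by the age weight.
  const sum = signs.reduce((acc, s) => acc + s.visual + s.auditory, 0);
  return AGE_WEIGHT[ageGroup] * sum;
}
```

For example, an infant with two signs scoring `(V=2, A=1)` and `(V=3, A=0)` sums to 6, which the 1.5 infant weight lifts to an urgency of 9.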
## Challenges we ran into

- **Dialect Nuance:** Amazigh is a rich language with various dialects (Tachelhit, Tamazight, etc.). We had to fine-tune our prompts so the AI looked for intent and emotional distress rather than just literal translation.
- **Low-Light Triage:** Most emergencies happen at 2 AM in poor lighting. We implemented logic that asks the user to turn on a flash when the visual data is too "noisy" to detect skin color changes.
- **Context Switching:** Training the model to switch between a "2 AM emergency" and a "remote check-in" (like a bump on a finger) required complex system instructions to keep the AI's tone appropriate.

## Accomplishments that we're proud of

- **Giving Amazigh a Voice:** Seeing a state-of-the-art AI understand a language that is often overlooked in global tech was incredibly moving.
- **State Management Elegance:** Using Riverpod to turn a complex, multi-step AI analysis into a simple, three-color triage UI (red, yellow, green) that anyone can understand.
- **Bridging the Distance:** Successfully demoing the "bump on the finger" scenario proved that this app can help people like me, living far away, support our elders from a distance.
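The low-light guard described in the challenges could work along these lines: grab a camera frame, estimate its mean brightness from the canvas pixel data, and prompt for a flash if it falls below a cutoff. A sketch in TypeScript; the threshold value and helper names are assumptions, not the shipped logic.

```typescript
// Sketch of the low-light guard: estimate the mean luminance of a camera frame
// (RGBA pixel data, as returned by a <canvas> getImageData() call) and decide
// whether to ask the user to turn on a flash. The threshold is illustrative.
const FLASH_THRESHOLD = 40; // 0–255 scale; below this, skin color is unreliable

// Rec. 601 luma weights convert each RGB pixel to perceived brightness.
function meanLuminance(rgba: Uint8ClampedArray): number {
  let total = 0;
  const pixels = rgba.length / 4; // 4 bytes per pixel: R, G, B, A
  for (let i = 0; i < rgba.length; i += 4) {
    total += 0.299 * rgba[i] + 0.587 * rgba[i + 1] + 0.114 * rgba[i + 2];
  }
  return total / pixels;
}

function shouldPromptForFlash(rgba: Uint8ClampedArray): boolean {
  return meanLuminance(rgba) < FLASH_THRESHOLD;
}
```

In the app this would run on a frame captured via getUserMedia and drawn to an off-screen canvas before the image is sent to the model.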
## What we learned

We learned that multimodal AI is the future of accessibility. It isn't just about "chatting" with a bot; it's about giving the bot eyes and ears to see the world as we do. We also learned a great deal about pediatric and geriatric emergency protocols, realizing that the AI's job isn't to diagnose but to triage: to be the bridge that gets the patient to a human doctor faster.

## What's next for FamilyBridge

- **Vitals Extraction:** We want to implement rPPG (remote photoplethysmography) to estimate heart rate and oxygen levels just by looking at the skin through the camera.
- **Offline Mode:** Emergencies often happen where internet is spotty. We aim to use smaller, on-device models for basic triage.
- **Expanding the Bridge:** Adding more indigenous and under-represented languages to ensure no grandmother, anywhere in the world, is left in the dark.
## Built With
- canvasapi
- gemini
- getusermedia
- javascript
- mediadevices
- react
- typescript
- vite