Inspiration
Every 11 minutes, an elderly patient is hospitalized for a preventable medication error. Every 90 seconds, an older adult is treated in an emergency room for a fall that a caregiver could have anticipated. Among adults over 65, adverse drug events account for nearly 700,000 ER visits annually in the United States alone, and the majority happen not from negligence but from the absence of any continuous monitoring between clinical appointments.

The average elderly patient sees their doctor 4 times a year. That leaves 361 days when no one is actively watching. No one detects the three consecutive missed doses. No one notices the blood pressure creeping up over two weeks. No one connects the new symptom to the drug interaction that was always there in the chart.

We built HealthCard because the gap between clinical visits is where preventable harm lives. We wanted to build the system that actively inhabits that silence: one that watches the pattern, understands what it means, and closes the loop to a caregiver or doctor before a manageable problem becomes a crisis.
What it does
HealthCard is an end-to-end AI caregiving platform that monitors elderly patients at home across four dimensions simultaneously:

- Medication tracking with computer vision. Point a phone camera at any pill bottle or tablet. Our QR/vision scanner identifies the medication, cross-references it against the patient's schedule, and logs it — no typing required. If a dose is missed, the system detects it and escalates.
- Voice-first symptom logging. Because elderly users shouldn't have to type. Say "I feel dizzy and my chest is tight" and HealthCard transcribes, parses, and runs AI risk analysis on those symptoms against your known conditions and current medications in real time.
- Proactive AI risk alerts. We don't just log data; we analyze patterns. When blood pressure readings trend upward over three days alongside a missed Lisinopril dose, HealthCard flags it to the care team before it becomes an ER visit.
- Emergency QR profile. Every patient gets a scannable QR code encoding their blood type, active medications, allergies, conditions, and emergency contact. A first responder scans it — no app, no login, no delay. The information is there in under 8 seconds.

The full care network — patient, family caregiver, and doctor — is connected in real time through a shared dashboard with role-based access.
How we built it
- The frontend runs on Next.js 14 with the App Router and TypeScript, designed specifically for elderly readability — large tap targets, high contrast, and voice-first interaction patterns. We used the Web Speech API for real-time voice-to-text symptom logging directly in the browser.
- The backend is Node.js with Express, structured into clean modular layers: routes → controllers → services → models. This isn't boilerplate — we designed it so each health feature (medication, symptom, vitals, alerts) operates as an independent service that shares a common data contract, making the AI layer a drop-in extension rather than a bolted-on afterthought.
- Data lives in Supabase (PostgreSQL) with row-level security so only authorized users — patient, caregiver, or doctor — can access each record. We designed the schema to handle heterogeneous health data: medication logs with timestamps, symptom reports with severity scores, and vital readings, all queryable for pattern analysis.
- Medication identification uses html5-qrcode for camera capture with a custom identification layer that matches scanned labels against the patient's active prescription list. We built QR code generation for the emergency profile so the full medical record is embeddable in a single scannable code that works offline.
- AI analysis routes symptoms and medication patterns through OpenAI's API with a custom system prompt engineered for elderly patient safety: it surfaces drug interactions, flags combinations of symptoms that warrant escalation, and generates plain-English explanations a 78-year-old can actually read.
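To make the "common data contract" idea above concrete, here is a minimal sketch of what such a contract could look like. The type and function names (`HealthEvent`, `makeMedicationEvent`) are our illustration, not HealthCard's actual schema:

```typescript
// A minimal sketch of a data contract shared by all health services.
// Names and fields here are illustrative, not HealthCard's real schema.

type EventKind = "medication" | "symptom" | "vital";
type EventSource = "voice" | "scan" | "manual";

interface HealthEvent {
  patientId: string;
  kind: EventKind;
  recordedAt: string; // ISO-8601 timestamp
  source: EventSource;
  payload: { [key: string]: string | number | boolean }; // kind-specific details
}

// Each feature service emits the same shape, so a pattern-analysis layer
// can consume any event stream without per-feature adapters.
function makeMedicationEvent(
  patientId: string,
  drugName: string,
  taken: boolean,
  source: EventSource,
): HealthEvent {
  return {
    patientId,
    kind: "medication",
    recordedAt: new Date().toISOString(),
    source,
    payload: { drugName, taken },
  };
}
```

Because every service emits the same envelope, adding a new feature (say, sleep tracking) means adding one `EventKind` rather than a new pipeline.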
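The identification layer that matches scanned labels against the active prescription list can be sketched roughly as follows. This is a simplified, hypothetical version (substring matching on normalized text); the real layer may use fuzzier matching:

```typescript
// Sketch of the identification step: match scanned label text against the
// patient's active prescriptions. Names are illustrative, not actual code.

interface Prescription {
  name: string;  // e.g. "Lisinopril"
  doseMg: number;
}

// Lowercase the label text and return the first prescription whose
// name appears in it, or null if nothing matches.
function matchScanToPrescription(
  labelText: string,
  active: Prescription[],
): Prescription | null {
  const haystack = labelText.toLowerCase();
  for (const rx of active) {
    if (haystack.includes(rx.name.toLowerCase())) {
      return rx;
    }
  }
  return null;
}
```

A non-match is the interesting case: a scanned bottle that isn't on the active list is exactly the kind of signal worth escalating to a caregiver.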
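The safety-focused system prompt can be assembled from the patient's known context before each API call. The function below is a hedged sketch of that idea; the wording and `buildSafetyPrompt` name are our own, not the prompt HealthCard ships:

```typescript
// Illustrative sketch: assemble a patient-specific safety prompt.
// The actual production prompt wording is not shown here.
function buildSafetyPrompt(conditions: string[], medications: string[]): string {
  return [
    "You are a caregiving assistant monitoring an elderly patient.",
    `Known conditions: ${conditions.join(", ") || "none on file"}.`,
    `Current medications: ${medications.join(", ") || "none on file"}.`,
    "Flag any reported symptom that could indicate a drug interaction",
    "or that warrants escalation to the care team.",
    "Explain your reasoning in plain English an elderly reader can follow.",
  ].join("\n");
}
```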
Challenges we ran into
The hardest technical challenge was building a unified data pipeline where three completely different input modalities (voice, camera, and manual entry) all produce the same normalized record in Supabase without any data inconsistency. The Web Speech API produces raw transcript strings; QR scanning produces label images; manual forms produce structured JSON. Getting all three to converge into a single patient record schema took significant backend design work.

Real-time responsiveness between the Next.js frontend and the Express backend was harder than expected once the AI analysis layer was in the loop. We had to architect the symptom analysis as a non-blocking async call so the UI never freezes while waiting for the OpenAI response.

Designing for elderly users forced us to make hard UI decisions. Every interaction had to work with large text, no hover states (touch-only), and voice as the primary input. That constraint made us better engineers.
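The convergence and the non-blocking analysis call described above can be sketched together. Everything here (the `SymptomRecord` shape, the `fromVoice`/`fromForm` normalizers, `logSymptom`) is a hypothetical simplification of the approach, not the actual implementation:

```typescript
// Sketch: normalize two of the three modalities into one record shape,
// then fire off slow AI analysis without blocking the response.
// All names are illustrative.

interface SymptomRecord {
  patientId: string;
  source: "voice" | "scan" | "manual";
  text: string;
  recordedAt: string; // ISO-8601
}

// Voice: the Web Speech API hands us a raw transcript string.
function fromVoice(patientId: string, transcript: string): SymptomRecord {
  return {
    patientId,
    source: "voice",
    text: transcript.trim(),
    recordedAt: new Date().toISOString(),
  };
}

// Manual: forms already produce structured JSON; we just map the fields.
function fromForm(patientId: string, form: { description: string }): SymptomRecord {
  return {
    patientId,
    source: "manual",
    text: form.description,
    recordedAt: new Date().toISOString(),
  };
}

// Non-blocking analysis: start the slow AI call but do not await it,
// so the handler can return the saved record to the UI immediately.
async function logSymptom(
  rec: SymptomRecord,
  analyze: (r: SymptomRecord) => Promise<void>,
): Promise<SymptomRecord> {
  void analyze(rec).catch((err) => console.error("analysis failed:", err));
  return rec;
}
```

The key design choice is that `analyze` is fire-and-forget with its own error handler: a slow or failed OpenAI call degrades the alerting, never the logging.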
Accomplishments that we're proud of
We are proud of building a fully integrated system that connects medication tracking, symptom reporting, and AI-driven risk analysis into one cohesive platform. Creating a voice-enabled and scan-based interface that remains simple enough for elderly users while still being technically meaningful was a major achievement. Most importantly, we successfully shifted the concept from a basic health tracker to a proactive caregiving assistant.
What we learned
1) How to design a full-stack, database-driven system where every user interaction (voice, scan, or manual input) is converted into structured, queryable health data.
2) How to build a modular backend architecture (routes, controllers, services, models) that keeps logic scalable, maintainable, and easy to extend with new AI features.
3) How to integrate multimodal inputs like the Web Speech API and QR/camera scanning into a unified pipeline that feeds directly into backend systems in real time.
4) That real-world healthcare systems require more than just functionality: they demand clean data design, reliability, and system-level thinking to transform raw inputs into meaningful, actionable insights.
What's next for HealthCard
1) Upgrade the backend with real-time streaming updates so patient data can be processed instantly instead of batch-style logging.
2) Enhance the database layer with stronger relationships and time-series tracking to better model long-term patient health trends.
3) Integrate more advanced AI services directly into the backend to enable predictive health alerts based on historical data patterns.
4) Scale the system into a multi-user caregiver dashboard with role-based access control and secure cloud database deployment for production readiness.
Built With
- canva
- css
- html
- javascript
- openai
- python
- react
- supabase
