Inspiration

When patients enter an emergency room or a telehealth queue, the triage process is often a bottleneck. Triage nurses are overwhelmed, and patients wait anxiously without knowing whether their symptoms are critical. We saw an opportunity to offload the initial, structured intake data collection to AI. However, we also knew that traditional Large Language Models (LLMs) hallucinate diagnoses, which is incredibly dangerous in healthcare. We were inspired to build a system that combines the rigid safety of deterministic medical guidelines with the conversational intelligence of generative AI.
What it does

TriageIQ is an AI-powered medical triage platform that bridges the gap between patient intake and clinical review.
For Patients, it provides a frictionless onboarding experience where they can upload unstructured medical reports or manually input symptoms and vitals.
For Clinicians, it provides a prioritized dashboard. Instead of just giving a magic urgency number, TriageIQ provides a transparent, step-by-step reasoning trace, a visual clinical knowledge graph showing why symptoms link to potential risks, and an independent "Confidence Calibration" score. If the AI is uncertain or vital data is missing, the system aggressively flags the patient with a "Needs Human Review" banner.
How we built it

We architected a Hybrid Triage Engine that merges Graph RAG with LLMs:
- Frontend: Built with React, Vite, and Tailwind/shadcn for a polished, responsive UI. We used react-force-graph-2d to visualize the patient's data structure for doctors.
- Backend: Node.js, Express, and MongoDB Atlas.
- Intake/OCR fallback: When a patient uploads a medical report image, we first run tesseract.js locally on the server. If it extracts enough text, we route that cheap text to Gemini 2.5; if the OCR fails, we route the image itself to Gemini Multimodal. This saves immense API cost and latency.
- The Graph Engine: We built a local Clinical Knowledge Graph containing medically curated symptom clusters and vital-sign "red flags."
- LLM Integration: When determining triage urgency, the backend maps the patient's symptoms to the graph, extracts the contraindications and differentials, and injects that strict deterministic context into the Google Gemini 2.5 prompt.
- Confidence Calibration: We wrote a custom algorithmic service that runs independently of the LLM to verify and score the AI's confidence, catching hallucinations before they reach the clinician dashboard.

Challenges we ran into

Building the Hybrid Triage Engine was our biggest hurdle. Initially, we sent patient symptoms straight to Gemini, but it would confidently diagnose rare diseases from minor symptoms. Grounding the LLM with the deterministic Graph Engine was difficult because matching user-entered strings (e.g., "my chest hurts") to clinical nodes (e.g., "chest_pain") required synonym-mapping algorithms to bridge the gap between layman's terms and medical terminology.
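The OCR fallback boils down to a routing decision on the locally extracted text. A minimal sketch of that logic, where `chooseIntakeRoute`, `MIN_CHARS`, and `MIN_ALPHA_RATIO` are illustrative names and thresholds rather than our production values:

```javascript
// Illustrative thresholds for deciding whether local OCR output is usable.
const MIN_CHARS = 120;
const MIN_ALPHA_RATIO = 0.6;

function chooseIntakeRoute(ocrText) {
  const text = (ocrText || "").trim();
  // Too little text recovered: fall back to sending the image itself.
  if (text.length < MIN_CHARS) return "gemini-multimodal";
  // Mostly non-alphabetic output usually means garbled OCR.
  const alphaRatio = (text.match(/[a-zA-Z]/g) || []).length / text.length;
  return alphaRatio >= MIN_ALPHA_RATIO ? "gemini-text" : "gemini-multimodal";
}
```

Only the cheap text route hits the LLM with plain text; everything else escalates to the multimodal endpoint.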
We also struggled with CORS and deployment issues across Vercel and Render, ensuring that Firebase Auth could communicate securely between our live production environments.
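The synonym mapping described in the challenges above can be sketched as a normalized phrase lookup. `SYMPTOM_SYNONYMS` and `mapToClinicalNode` are hypothetical names, and the real graph covers far more clusters than this toy table:

```javascript
// Toy synonym table mapping layman phrases to clinical graph node ids.
const SYMPTOM_SYNONYMS = {
  chest_pain: ["chest hurts", "chest pain", "pain in my chest", "tight chest"],
  shortness_of_breath: ["can't breathe", "short of breath", "breathless"],
};

function mapToClinicalNode(userInput) {
  // Lowercase and strip punctuation so "My chest hurts!" still matches.
  const normalized = userInput.toLowerCase().replace(/[^a-z\s']/g, " ");
  for (const [node, phrases] of Object.entries(SYMPTOM_SYNONYMS)) {
    if (phrases.some((p) => normalized.includes(p))) return node;
  }
  return null; // unmapped input can be flagged for human review
}
```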
Accomplishments that we're proud of

We are incredibly proud of our OCR Fallback Architecture, which makes patient onboarding highly accessible without bankrupting our API usage.
More importantly, we are proud of the Confidence Calibration layer. It feels like a genuine breakthrough for us to intentionally discount the AI's own self-reported confidence when we mathematically know it is missing core vital signs, making the app much safer for real-world deployment.
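A minimal sketch of that discounting idea, with hypothetical vital names, penalty weights, and thresholds (the production calibration is more involved):

```javascript
// Discount the model's self-reported confidence for each missing core vital.
// CORE_VITALS, PENALTY_PER_MISSING, and REVIEW_THRESHOLD are illustrative.
const CORE_VITALS = ["heartRate", "bloodPressure", "respiratoryRate", "spo2", "temperature"];
const PENALTY_PER_MISSING = 0.15;
const REVIEW_THRESHOLD = 0.6;

function calibrateConfidence(llmConfidence, vitals) {
  const missing = CORE_VITALS.filter((v) => vitals[v] == null).length;
  const calibrated = Math.max(0, llmConfidence - missing * PENALTY_PER_MISSING);
  // Low calibrated confidence triggers the "Needs Human Review" banner.
  return { calibrated, needsHumanReview: calibrated < REVIEW_THRESHOLD };
}
```

The key property is that the score is computed outside the LLM, so a confidently wrong answer over sparse data still gets routed to a human.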
What we learned

We learned that LLMs should not be the source of truth in medical applications; they should be the reasoning engine layered on top of a deterministic source of truth. We also deepened our understanding of React state management, complex backend orchestration patterns (chaining multiple async services), and the intricacies of configuring production CORS policies.
What's next for TriageIQ: Clinical Decision Support

- WebSockets migration: Move the Clinician Dashboard from HTTP polling to WebSockets (Socket.io) so that when a Level 1 Critical patient enters the queue, the clinician's screen flashes instantly.
- Vector retrieval: Transition our deterministic knowledge-graph mapping to vector embeddings (e.g., MongoDB Atlas Vector Search) so that fuzzy semantic matching becomes far more accurate than simple string comparison.
- EHR interoperability: Integrate FHIR-standard endpoints so our AI-generated triage summaries can connect seamlessly into hospital systems like Epic or Cerner.
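The planned move from string comparison to embeddings rests on vector similarity; a minimal cosine-similarity sketch (embeddings assumed precomputed by an external model, node names hypothetical):

```javascript
// Cosine similarity between two embedding vectors. Atlas Vector Search
// would do this ranking at scale, but the core idea is the same.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank clinical nodes by semantic closeness to a patient's phrase embedding.
function nearestNode(queryVec, nodes) {
  return nodes
    .map((n) => ({ id: n.id, score: cosineSimilarity(queryVec, n.vec) }))
    .sort((x, y) => y.score - x.score)[0].id;
}
```

Unlike exact string matching, this tolerates phrasings that share no literal words with the clinical node's label.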
Built With
- express.js
- firebase-authentication
- google-gemini-api
- mongodb-atlas
- mongoose
- node.js
- react
- shadcn/ui
- tailwind-css
- tesseract.js
- vite