Inspiration
Healthcare information is fragmented. People juggle multiple apps for symptoms, fitness, mental health, and doctor visits — none of which talk to each other. We were inspired by the question: what if a single, intelligent platform could be your entire health companion?
The rise of large language models, and specifically Google's Gemini API, gave us the confidence to build something that goes beyond static dashboards. We wanted AI to be a first-class citizen of the experience — not a chatbot bolted on the side, but a multi-agent system woven into every health workflow. The idea of Dr. Echo — an AI doctor who can triage emergencies, coach nutrition, guide fitness, and support mental wellness — became our north star.
We were also moved by the reality that many people lack immediate access to a doctor. A symptom checker, a medical report reader, or even a voice-enabled health assistant could make a meaningful difference to someone, somewhere. That urgency drove us to build fast and build broadly.
What it does
CareNexa is an all-in-one AI-powered healthcare platform with 15+ interconnected modules:
| Module | What it does |
|---|---|
| 🤖 Dr. Echo AI Assistant | Multi-agent conversational AI for general health, nutrition, fitness, mental wellness, and emergency triage |
| 📄 Medical Report Analysis | OCR-style extraction of health data from image and PDF uploads, with SHA-256 audit hashing |
| 📊 Health Dashboard | Personalised health score, quests, trend charts, and achievement receipts |
| ❤️ Heart Monitor | Real-time heart rate tracking and cardiovascular trend visualisation |
| 🩹 Symptom Checker | AI-guided symptom assessment and triage recommendations |
| 🏃 Fitness Tracker | Activity logging, goal setting, and progress analytics |
| 🧠 Mental Wellness | Mood journaling, guided exercises, and AI-powered mental health support |
| 🌸 Menstruation Tracker | Cycle logging, prediction, and wellness insights |
| 🗺️ Community Health Maps | Leaflet/OpenStreetMap-based community health pin board and safe-route assistance |
| 🩺 VR Doctor | Immersive, voice-enabled virtual doctor consultation experience |
| 👁️ Vision Module | AI-powered visual health analysis |
| 📚 Learning Center | Curated health education content |
| 📅 Doctor Appointments | Appointment scheduling and management |
| 🏥 Patient Report | Structured patient health record viewer |
| 🌐 Health Hub | Centralised health content and resource aggregator |
The AI audit trail uses SHA-256 hashing on every extracted medical report payload, defined as:
$$H = \text{SHA-256}(\text{payload}) \in \{0,1\}^{256}$$
This ensures tamper-evidence for sensitive health data — a feature rarely seen in hackathon-grade health apps.
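The audit hash itself is a one-liner over Node's standard library. A minimal sketch (the `auditHash` name is illustrative, not the project's actual API):

```typescript
import { createHash } from "crypto";

// Hash an extracted report payload so later re-extractions can be
// compared byte-for-byte; any tampering changes the digest.
function auditHash(payload: string): string {
  return createHash("sha256").update(payload, "utf8").digest("hex");
}
```

Because the digest is deterministic, re-running extraction on the same payload must reproduce the same hash, which is what makes the trail tamper-evident.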
How we built it
CareNexa is a Next.js 13 (App Router) application written in TypeScript, deployed on Vercel.
Frontend architecture:
- React 18 with the Next.js App Router for server and client components
- Tailwind CSS + Radix UI primitives for an accessible, themeable design system
- Framer Motion for fluid page transitions and micro-interactions
- Zustand for persistent client-side state management
- Recharts, Nivo, and Chart.js for rich visual analytics across the dashboard and health modules
AI layer:
- Google Generative AI SDK (`@google/generative-ai`) powering Dr. Echo
- A custom multi-agent routing system via `POST /api/chat`, dispatching to specialised agents based on conversation context (nutrition, fitness, mental health, emergency triage)
- `POST /api/ocr` for medical report extraction, with SHA-256 integrity hashing
- `POST /api/health-insights` for dynamic, AI-generated personalised health insight cards
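The real dispatcher uses Gemini and full conversation context; purely to illustrate the routing shape, here is a keyword-based sketch (agent names and patterns are hypothetical):

```typescript
type Agent = "emergency" | "nutrition" | "fitness" | "mental-health" | "general";

// Illustrative intent patterns; the production system infers intent
// with the LLM rather than regexes.
const KEYWORDS: Record<Exclude<Agent, "general">, RegExp> = {
  emergency: /\b(chest pain|can'?t breathe|bleeding|overdose)\b/i,
  nutrition: /\b(diet|calorie|meal|protein)\b/i,
  fitness: /\b(workout|run|exercise|steps)\b/i,
  "mental-health": /\b(anxious|stress|depress|mood)\b/i,
};

function routeMessage(message: string): Agent {
  // Emergency is checked first: triage safety beats topical matching.
  for (const agent of ["emergency", "nutrition", "fitness", "mental-health"] as const) {
    if (KEYWORDS[agent].test(message)) return agent;
  }
  return "general";
}
```

The ordering of the checks encodes a priority policy, which is the part that carries over to the LLM-backed version: emergency intent always wins, and anything unmatched falls through to the general agent.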
Maps & community:
- Leaflet + react-leaflet for the interactive community health map
- `GET`/`POST /api/community-pins` for crowd-sourced health location data
- `POST /api/safe-route` for context-aware, AI-assisted route recommendations
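As a rough model of what the pins endpoints do, here is an in-memory sketch (types and function names are illustrative; the real routes persist their data):

```typescript
interface HealthPin {
  id: number;
  lat: number;
  lng: number;
  label: string;
}

const pins: HealthPin[] = [];
let nextId = 1;

// POST /api/community-pins, reduced to its core: validate-and-append.
function addPin(lat: number, lng: number, label: string): HealthPin {
  const pin = { id: nextId++, lat, lng, label };
  pins.push(pin);
  return pin;
}

// GET /api/community-pins, optionally filtered to the map's bounding box
// so the client only renders pins in view.
function listPins(bounds?: { south: number; west: number; north: number; east: number }): HealthPin[] {
  if (!bounds) return pins;
  return pins.filter(
    (p) => p.lat >= bounds.south && p.lat <= bounds.north &&
           p.lng >= bounds.west && p.lng <= bounds.east
  );
}
```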
Project structure was kept modular from day one — every feature lives in its own app/ route and components/ subdirectory, making parallel development possible:
```
CareNexa/
├─ app/          # 15+ Next.js routes
├─ components/   # ai-assistant, analysis, dashboard, health, maps, vr-doctor, ...
├─ hooks/        # custom React hooks
├─ lib/store/    # Zustand state slices
└─ types/        # shared TypeScript types
```
Challenges we ran into
1. Multi-agent routing without a framework
Building a multi-agent system from scratch — without LangChain or similar — meant designing our own intent detection and agent dispatch logic inside the /api/chat route. Getting agents to hand off context cleanly, without losing conversation history, was harder than expected.
2. Medical OCR from the browser
Extracting structured health data from user-uploaded PDFs and images required careful prompt engineering of Gemini's multimodal capabilities. Ensuring the extraction was consistent enough to compute a reliable SHA-256 audit hash — and that the hash matched across re-runs — took multiple iterations.
3. Leaflet SSR incompatibility
Leaflet and the Next.js App Router do not get along by default. Leaflet accesses `window` on import, which breaks server-side rendering. We had to implement careful dynamic imports with `ssr: false` guards for all map components.
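The standard fix, and roughly what we did, is to defer each map component to the client with `next/dynamic` (the component path below is hypothetical):

```typescript
import dynamic from "next/dynamic";

// With ssr: false the module — and Leaflet's window access at import
// time — is never evaluated on the server, only in the browser.
const CommunityMap = dynamic(
  () => import("@/components/maps/community-map"), // illustrative path
  { ssr: false }
);
```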
4. Scope management under a deadline
With 15+ modules, the temptation to over-engineer each one was real. We had to make fast decisions about depth vs. breadth, shipping a working skeleton for every route while going deep on the AI assistant, dashboard, and OCR pipeline.
5. Voice integration across contexts
Implementing voice-enabled experiences in both the AI assistant and the VR doctor — with different interaction models — required building a shared voice hook abstraction that could adapt to each context without duplicating the Web Speech API logic.
Accomplishments that we're proud of
- 🏆 15+ fully functional health modules in a single cohesive platform — all interconnected through a shared design system and state layer
- 🔐 SHA-256 audit hashing on medical report payloads — a production-grade security feature built into a hackathon project
- 🤖 Multi-agent Dr. Echo — a conversational AI that genuinely routes across health domains, not just a generic chatbot
- 🗺️ Live community health map with crowd-sourced pins and AI safe-route assistance
- 🎙️ Voice-enabled VR doctor experience — pushing the boundaries of what a web health app can feel like
- 📐 A clean, fully TypeScript codebase with zero `any` shortcuts and a well-structured component hierarchy
- 🚀 Deployed and live at carenexa.vercel.app with zero downtime
What we learned
- Prompt engineering is an engineering discipline. Getting Gemini to reliably return structured, parseable health data — especially from noisy medical images — required systematic iteration, not guesswork.
- Multi-agent systems need contracts. Without well-defined input/output schemas between agents, context bleeds across conversations. TypeScript interfaces saved us here.
- Accessibility matters in health tech. Using Radix UI primitives early on meant keyboard navigation and ARIA compliance came largely for free — something we'd have missed if we'd built raw.
- State management at scale. Zustand's slice pattern proved invaluable for keeping 15+ modules' state isolated but composable.
- Speed and quality aren't always opposites. With the right abstractions (component library, shared hooks, typed API responses), we shipped fast and maintained code quality.
What's next for CareNexa
- 🔗 Real EHR/FHIR integration — connecting CareNexa to actual electronic health record systems via HL7 FHIR APIs for real patient data ingestion
- 📱 React Native mobile app — bringing CareNexa to iOS and Android with offline-first support and wearable device sync
- 🧬 Genomic health module — integrating consumer genomic data (e.g., 23andMe exports) into personalised health insights
- 🏥 Clinician portal — a separate dashboard for healthcare providers to review patient-submitted reports and AI triage summaries
- 🔒 End-to-end encryption — encrypting all health data at rest and in transit, moving toward HIPAA-grade compliance
- 🌍 Multilingual Dr. Echo — expanding the AI assistant to support 10+ languages for global health accessibility
- 📊 Longitudinal health scoring — tracking health scores over time with trend prediction using time-series models:
$$\hat{s}_{t+1} = \alpha \cdot s_t + (1 - \alpha) \cdot \bar{s} + \epsilon$$
where $s_t$ is the health score at time $t$, $\bar{s}$ is the rolling average, $\alpha$ is a smoothing factor, and $\epsilon$ captures AI-predicted deviation from baseline.
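The update rule above is a single line of code. A sketch with $\epsilon$ fixed at 0 for a deterministic example (the function name is ours):

```typescript
// One exponential-smoothing step: blend the latest score with the
// rolling average, then add the AI-predicted deviation epsilon.
function nextScore(current: number, rollingAvg: number, alpha: number, epsilon = 0): number {
  return alpha * current + (1 - alpha) * rollingAvg + epsilon;
}
```

With `alpha` near 1 the prediction tracks the latest score closely; with `alpha` near 0 it stays anchored to the rolling average.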
Built With
- chart.js
- css3
- eslint
- framer-motion
- google-gemini
- google-generative-ai
- html5
- javascript
- leaflet.js
- next.js
- nivo
- node.js
- npm
- openstreetmap
- postcss
- radix-ui
- react
- react-leaflet
- recharts
- rest-api
- sha-256
- shadcn-ui
- tailwind-css
- typescript
- vercel
- web-speech-api
- zustand