Moha Health
My own health assistant. AI-powered intake that actually listens.
Built for Hack Canada 2026.
Inspiration
In Canada, healthcare often begins with a long wait.
Patients may spend hours waiting before anyone even asks detailed questions about their symptoms. The intake process — the first step in care — is often rushed, even though it plays a critical role in identifying urgent cases.
We built Moha Health to explore a simple idea:
What if the first step of healthcare was handled by an AI system that carefully listens, asks the right questions, and routes patients to the right care faster?
Instead of static forms or basic chatbots, Moha Health simulates a coordinated healthcare intake team made of specialized AI agents.
It doesn't replace doctors.
It simply helps patients reach the right care faster, with clearer information in hand.
What it does
Moha Health is a multi-agent AI healthcare assistant that simulates the first stage of a hospital visit.
Users describe symptoms using text or voice, and the system guides them through a structured intake process.
The system:
- Collects symptoms through a conversational AI intake nurse
- Routes cases to the appropriate specialist AI (dermatology, dental, cardiology)
- Asks focused follow-up questions
- Generates a clinical-style triage report
Optional inputs can enhance the assessment:
- Symptom images
- A short face video to estimate heart rate and respiration
- A stored health profile for returning users
The result is a clear triage summary that can help identify urgency and guide the next step in care.
How it works
Moha Health uses a multi-agent AI architecture that mimics a hospital intake workflow.
The system pipeline can be summarized as:
$$ \text{Patient} \rightarrow \text{Intake Nurse} \rightarrow \text{Router} \rightarrow \text{Specialist Agent} \rightarrow \text{Triage Engine} \rightarrow \text{Clinical Report} $$
Each agent has a specific responsibility:
- Intake nurse agent gathers symptoms and missing information
- Router agent decides whether a specialist should be consulted
- Specialist agents ask focused follow-up questions
- Triage engine generates an urgency level and report
This modular design makes the system easier to extend with additional specialties in the future.
Key Features
Conversational Intake
An AI intake nurse collects symptom information such as location, severity, and duration.
Smart Routing
A routing agent determines whether a specialist consultation is needed.
Specialist Follow-up
Dermatology, dental, or cardiology agents ask targeted questions.
Triage + Clinical Report
The system generates a readable triage report and structured output.
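The structured half of that output might look like a typed schema. The project's stack lists pydantic, but this dependency-free sketch uses stdlib dataclasses instead, and every field name is an assumption rather than the real report shape.

```python
# Hypothetical shape of the structured triage output; field names are
# illustrative (the stack lists pydantic, but dataclasses keep this
# sketch dependency-free).
from dataclasses import dataclass, asdict

@dataclass
class TriageReport:
    chief_complaint: str
    specialty: str
    urgency: str              # e.g. "routine", "soon", or "urgent"
    followup_answers: dict    # answers collected by the specialist agent

report = TriageReport(
    chief_complaint="itchy rash on forearm",
    specialty="dermatology",
    urgency="routine",
    followup_answers={"duration": "3 days", "spreading": "no"},
)
payload = asdict(report)  # JSON-ready dict for the frontend
```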
Voice Interaction
Users can speak instead of typing, with speech-to-text handling input and text-to-speech reading responses aloud.
Vitals from Video
A short face video can estimate heart rate and respiration.
Symptom Images
Users can attach a photo when describing visual symptoms.
Health Profile
Optional patient profiles allow returning users to store health history.
Tech Stack
Backend
- FastAPI (Python 3.11)
- Google Gemini via Backboard for LLM orchestration
- Presage for vitals estimation from video
- ElevenLabs for voice synthesis
- Cloudinary for media uploads
Frontend
- React
- TypeScript
- Vite
- Tailwind
- shadcn/ui
Deployment
- Railway for backend deployment
- Replit for frontend hosting
- Tailscale for secure networking when needed
Challenges we ran into
Designing multi-agent coordination
The biggest challenge was building a system where multiple AI agents could collaborate while maintaining a coherent conversation with the user.
Careful prompt design and structured context passing were required to ensure that specialists received the right information.
Balancing AI reasoning with reliable triage
Healthcare applications require predictable outcomes.
To maintain reliability, Moha Health combines:
- LLM reasoning
- rule-based triage logic
- structured report generation
This ensures results remain interpretable and consistent.
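One way to combine the three is to let the LLM propose an urgency level and then apply hard rules that can only escalate it, never downgrade it. The sketch below illustrates that pattern; the red-flag phrases, thresholds, and function names are invented for illustration and are not the project's real triage logic.

```python
# Sketch of LLM-plus-rules triage; the red-flag rules and the ordering
# of urgency levels are illustrative, not the project's actual logic.
URGENCY_ORDER = ["routine", "soon", "urgent"]

# Hard rules: each phrase sets a floor on the final urgency.
RED_FLAGS = {
    "chest pain": "urgent",
    "shortness of breath": "urgent",
    "fever": "soon",
}

def apply_rules(symptom_text: str, llm_urgency: str) -> str:
    """Rules may escalate the LLM's suggestion but never downgrade it."""
    level = URGENCY_ORDER.index(llm_urgency)
    for flag, floor in RED_FLAGS.items():
        if flag in symptom_text.lower():
            level = max(level, URGENCY_ORDER.index(floor))
    return URGENCY_ORDER[level]

# An LLM that under-triages chest pain gets corrected by the rules:
final = apply_rules("Chest pain radiating to left arm", "routine")
```

The key property is asymmetry: a creative but wrong LLM answer can never lower the urgency below what the deterministic rules demand, which keeps the output interpretable and consistent.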
Integrating video vitals
Extracting heart rate and respiration from video required integrating external models and building a reliable media processing pipeline.
Handling uploads, processing, and responses smoothly during live interaction was a significant engineering challenge.
What we learned
Building Moha Health taught us several important lessons:
- Multi-agent AI systems can mirror real-world workflows effectively
- Healthcare AI must balance flexibility with safety and interpretability
- Voice, vision, and language models together create more natural interactions
- User experience is just as important as model capability
Most importantly, we learned that AI can assist healthcare workflows without trying to replace clinicians.
What's next for Moha Health
Future improvements could include:
- more medical specialties
- integration with clinical guidelines
- additional physiological signals
- multilingual support
- integration with healthcare systems
The long-term goal is to create a digital front door to healthcare that helps patients reach the right care faster.
Sponsors & Thanks
This project was built during Hack Canada 2026, part of the Major League Hacking 2026 season.
Special thanks to:
Google Gemini
Tailscale
Stan
Cloudinary
Backboard
ElevenLabs
GitHub
Author
Nafisat Ibrahim
Built With
- auth0
- backboard-api
- cloudinary
- docker
- elevenlabs
- fastapi
- git
- github
- google-gemini
- javascript
- presage
- pydantic
- python
- railway
- react
- replit
- shadcn/ui
- speech-to-text
- tailscale
- tailwind-css
- typescript
- uvicorn
- vite