## 💡 Inspiration
Healthcare professionals often spend more time on paperwork than with patients. Writing SOAP notes and coding ICD-10 diagnoses is repetitive, time-consuming, and prone to error.
We wanted to build a tool that could lighten the documentation load so clinicians can focus on what matters most — patient care.
## 🏗️ What we built

MedAI Clerk is an AI-powered assistant that:

- Converts unstructured patient notes into structured SOAP notes
- Suggests ICD-10 codes with justifications
- Runs in two modes:
  - **API mode** — lightweight, powered by OpenAI’s gpt-4o-mini
  - **OSS mode** — fully local, powered by GPT-OSS (20B open-weight model) for privacy and hackathon compliance
- Supports quantized GPU inference (4-bit / 8-bit) for faster, smaller runs
- Includes a doctor-friendly Next.js frontend with a clean UI
- Fully Dockerized for one-command deployment
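The structured output described above can be sketched as a simple schema. This is an illustrative sketch, not MedAI Clerk's actual data model — the class and field names are hypothetical, and the example ICD-10 suggestion is invented for demonstration:

```python
from dataclasses import dataclass, field

@dataclass
class ICD10Suggestion:
    # A suggested diagnosis code plus the model's reasoning for proposing it.
    code: str
    description: str
    justification: str

@dataclass
class SOAPNote:
    # The four canonical SOAP sections, each as free text,
    # plus the list of code suggestions attached to the note.
    subjective: str
    objective: str
    assessment: str
    plan: str
    icd10_suggestions: list[ICD10Suggestion] = field(default_factory=list)

note = SOAPNote(
    subjective="Patient reports 3 days of productive cough and fever.",
    objective="Temp 38.4 C, crackles at right lung base.",
    assessment="Likely community-acquired pneumonia.",
    plan="Start empiric antibiotics; chest X-ray; follow up in 48 h.",
    icd10_suggestions=[
        ICD10Suggestion(
            code="J18.9",
            description="Pneumonia, unspecified organism",
            justification="Fever, productive cough, and focal crackles.",
        )
    ],
)
print(note.icd10_suggestions[0].code)  # → J18.9
```

Having the LLM emit a fixed schema like this, rather than free-form prose, is what makes the notes machine-checkable before they reach a clinician.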
## ⚙️ How we built it

- **Frontend:** Next.js (React), TailwindCSS, Framer Motion
- **Backend:** FastAPI with a dual LLM provider (`LLM_MODE=api|oss`)
- **LLM integration:** OpenAI API + Hugging Face Transformers (GPT-OSS 20B)
- **Quantization:** BitsAndBytes for 4-bit/8-bit GPU efficiency
- **Deployment:** Docker + Docker Compose
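The `LLM_MODE=api|oss` switch can be sketched as a small provider factory behind a shared interface. A minimal sketch only — the class names and stub responses here are hypothetical, not the project's actual code:

```python
import os
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface for both backends, so the rest of the
    FastAPI app never needs to know which mode is active."""
    @abstractmethod
    def generate(self, prompt: str) -> str: ...

class APIProvider(LLMProvider):
    # In the real service this would call OpenAI's gpt-4o-mini.
    def generate(self, prompt: str) -> str:
        return f"[api:gpt-4o-mini] {prompt[:40]}"

class OSSProvider(LLMProvider):
    # In the real service this would run GPT-OSS locally
    # via Hugging Face Transformers.
    def generate(self, prompt: str) -> str:
        return f"[oss:gpt-oss-20b] {prompt[:40]}"

def get_provider() -> LLMProvider:
    # A single environment variable selects the backend.
    mode = os.environ.get("LLM_MODE", "api")
    if mode == "oss":
        return OSSProvider()
    if mode == "api":
        return APIProvider()
    raise ValueError(f"Unknown LLM_MODE: {mode!r}")

os.environ["LLM_MODE"] = "oss"
print(get_provider().generate("Convert this note to SOAP format"))
```

Because both providers satisfy one interface, the prompt-building and response-parsing code stays identical across modes — which is what keeps a single codebase viable.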
## 🚧 Challenges we faced

- Integrating the OSS model was tricky due to its large size (~45 GB) and GPU memory constraints.
- Setting up quantized inference required tuning Transformers together with BitsAndBytes.
- Keeping the API and OSS backends aligned within a single codebase.
- Designing a minimal, clinician-friendly UX for doctors.
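A back-of-envelope calculation shows why 4-bit quantization was the way out of the memory squeeze: weight memory for a 20B-parameter model scales linearly with bits per weight. This sketch counts weights only and ignores activations, KV cache, and framework overhead:

```python
def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    # Weights only; activations, KV cache, and framework overhead
    # are ignored, which is why real footprints run higher
    # (e.g. the ~45 GB checkpoint mentioned above).
    return n_params * bits_per_weight / 8 / 1e9

N = 20e9  # parameter count of a 20B model
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: ~{weight_memory_gb(N, bits):.0f} GB")
# → 16-bit: ~40 GB
# →  8-bit: ~20 GB
# →  4-bit: ~10 GB
```

At 4 bits the weights fit on a single consumer GPU, which is what made fully local inference feasible for the hackathon.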
## 📚 What we learned

- How to serve large language models locally with Hugging Face + Docker.
- The tradeoffs between cloud APIs (fast, lightweight) and OSS models (private, heavy).
- Practical use of quantization to shrink memory requirements.
- The importance of usability in clinical AI tools.
## 🌍 Impact

By reducing documentation time, MedAI Clerk helps clinicians:

- Spend more time with patients
- Reduce administrative burden
- Improve both care quality and job satisfaction
This shows how open-weight AI models like GPT-OSS can enable privacy-preserving, real-world healthcare applications.