Inspiration
Pierre, on our team, has a father who works as an ICU doctor. His father's ER experience also showed him how much patients struggle to overcome post-op problems. He is convinced, as is the whole team, that those problems can be solved with tech.
What it does
- Ingest diagnoses, medical prescriptions and clinical notes (audio + image scans)
- Create an actionable plan from this data: the user can accept suggestions, which are then carried out automatically (e.g. booking an appointment)
- Save a patient card with every detail, so the same precise context is shared across different medical specialists
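A patient card like this can be modeled as a small Pydantic schema; a minimal sketch with hypothetical field names (the real card carries more detail):

```python
from datetime import date
from pydantic import BaseModel

class Treatment(BaseModel):
    drug: str
    dosage: str          # e.g. "1 g, 3x/day"
    started: date

class PatientCard(BaseModel):
    """Shared context handed to any specialist (field names are illustrative)."""
    patient_id: str
    conditions: list[str] = []
    treatments: list[Treatment] = []
    notes: list[str] = []        # distilled clinical notes

card = PatientCard(
    patient_id="p-001",
    conditions=["post-op knee replacement"],
    treatments=[Treatment(drug="paracetamol", dosage="1 g, 3x/day",
                          started=date(2024, 5, 2))],
)
print(card.model_dump_json(indent=2))
```

Because the card is a strict schema rather than free text, every specialist (and every agent step) reads and writes the same structure.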
How we built it
Ingest – Mistral OCR + audio transcription parse prescriptions, operation reports and clinical notes into structured medical data.
Reason & Act – A Mistral agent cross-references the patient's full context (history, treatments, interactions), produces a prioritized recovery plan, and executes validated actions via function calling: booking labs, scheduling appointments, setting reminders. All sensitive data stays encrypted on-device – zero-knowledge by design.
Share – A patient card is auto-generated and enriched after every consultation, giving any specialist the full picture in seconds.
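The "executes validated actions via function calling" step boils down to a tool schema plus a dispatcher; a minimal sketch with hypothetical tool names (the real handlers call booking and calendar APIs):

```python
import json

# Tool schema in the shape Mistral-style function calling expects.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "book_appointment",
        "description": "Book a follow-up appointment for the patient.",
        "parameters": {
            "type": "object",
            "properties": {
                "specialty": {"type": "string"},
                "date": {"type": "string", "description": "ISO date"},
            },
            "required": ["specialty", "date"],
        },
    },
}]

def book_appointment(specialty: str, date: str) -> dict:
    # Hypothetical handler: the real one would hit a booking API.
    return {"status": "booked", "specialty": specialty, "date": date}

HANDLERS = {"book_appointment": book_appointment}

def dispatch(tool_call: dict) -> dict:
    """Run the handler the model asked for, only after the user accepted it."""
    fn = HANDLERS[tool_call["name"]]
    return fn(**json.loads(tool_call["arguments"]))

result = dispatch({"name": "book_appointment",
                   "arguments": '{"specialty": "cardiology", "date": "2024-06-01"}'})
```

Keeping the user-acceptance gate in front of `dispatch` is what makes the suggestions safe to auto-execute.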
Challenges we ran into
- OCR – Medical documents are messy (handwritten, scanned, mixed layouts). We chose Mistral OCR over Pixtral vision for a uniform pipeline: one API surface for both images and PDFs.
- Audio transcription – Mistral has no STT. We picked ElevenLabs `scribe_v2` for its keyterm biasing: inject medical vocabulary to avoid mangling drug names and procedures.
- Structured output – LLMs return loose JSON that breaks Pydantic validation. We used `instructor` to get automatic schema enforcement and retry-on-failure out of the box.
- Team parallelization – Four people + AI coding agents = merge chaos. Strict interface contracts (Pydantic schemas) from day one kept us aligned, so we could build different features against mocks.
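The validate-and-retry loop that `instructor` gives us out of the box, sketched manually with a stubbed model so the shape is visible:

```python
from pydantic import BaseModel, ValidationError

class RecoveryAction(BaseModel):
    kind: str       # e.g. "appointment", "lab", "reminder"
    due: str        # ISO date

def parse_with_retries(ask_model, max_retries: int = 3) -> RecoveryAction:
    """Re-ask the model until its JSON passes the Pydantic schema."""
    last_err = None
    for attempt in range(max_retries):
        raw = ask_model(attempt)   # instructor feeds the validation error back on retry
        try:
            return RecoveryAction.model_validate_json(raw)
        except ValidationError as err:
            last_err = err
    raise last_err

# Stubbed model: returns broken JSON first (missing "due"), then a valid object.
replies = ['{"kind": "lab"}', '{"kind": "lab", "due": "2024-06-03"}']
action = parse_with_retries(lambda attempt: replies[attempt])
```

With `instructor` the same behavior is a `max_retries` argument on the patched client call instead of a hand-rolled loop.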
Accomplishments that we're proud of
- End-to-end flow working before the deadline – not a mockup: it parses real documents and produces real actions.
- Grounded in a real problem. Pierre's father is an ICU doctor. We built something we believe should exist.
What we learned
- ElevenLabs TTS is shockingly good – voice check-ins feel human, not robotic.
- Simple products are the hardest to build – resisting feature creep was the real challenge.
What's next for AfterMed
Edge vault – Move to fully on-device encrypted storage (SQLCipher) so patient data never touches a server. Key derived from the patient only: a true zero-knowledge architecture.
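For a key "derived from the patient only", the usual pattern is a KDF over a patient-held secret; a minimal sketch using stdlib PBKDF2 (parameters are illustrative, and the derived key would be handed to SQLCipher's `PRAGMA key`):

```python
import hashlib

def derive_vault_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 256-bit key from a patient-held secret; the server never sees it."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

key = derive_vault_key("patient-secret", salt=b"per-device-random-salt")
hex_key = key.hex()
# SQLCipher would then open the vault with something like:
#   PRAGMA key = "x'<hex_key>'";
```

Since the key never leaves the device, losing the passphrase means losing the vault – the trade-off zero-knowledge storage accepts by design.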
Living patient card – A card that gets richer with every consultation: conditions, treatments, drug interactions, past procedures, and patient-generated questions, all auto-updated. Shareable with any specialist via QR code or secure link, so the next doctor gets the full picture in 30 seconds instead of 10 minutes of "remind me what happened."
Active follow-up – Voice check-ins via ElevenLabs at D+1, D+3, and D-1 before appointments. The agent asks how you're feeling, flags anomalies, nudges overdue actions, and prepares you for your next visit.
Connected health data – Integrate wearable data (Thryve: sleep, activity, heart rate) to contextualize follow-up. Detect deviations early – e.g. unusual heart rate + post-op patient = proactive alert.
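The "detect deviations early" check can be as simple as flagging readings far from the patient's own baseline; a minimal sketch (window and threshold are illustrative):

```python
from statistics import mean, stdev

def flag_anomalies(baseline: list[float], recent: list[float],
                   z_threshold: float = 2.5) -> list[float]:
    """Return recent readings more than z_threshold std-devs from the baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in recent if abs(x - mu) > z_threshold * sigma]

# Resting heart rate: a stable personal baseline, then a post-op spike
# worth a proactive alert.
baseline = [62, 64, 63, 61, 65, 63, 62, 64]
alerts = flag_anomalies(baseline, recent=[63, 64, 95])  # → [95]
```

Comparing against the patient's own baseline, rather than population norms, is what makes the alert meaningful for post-op follow-up.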
Built With
- elevenlabs
- mistral
- pydantic
- python
- react
- typescript