AyuSight AI | AI EHR | Making doctors type no more
Inspiration
It all started with my mom. For the last 10 years, she's been living with diabetes, and every 3-4 months, we're back at the diabetologist's office. I've watched her prescriptions evolve from handwritten notes to printed documents, but the core problem remained: managing a decade's worth of fragmented medical history was a constant struggle for both us and her doctors. This personal pain point led our team to create the Ayu App (https://www.ayuapp.com/), a mobile app to help patients consolidate their health records. We launched it and got over 250 downloads in the last 2 months, but we knew we had only solved one half of the equation. To truly "complete the loop," we had to understand the other side: the doctor's. We went to the Bombay Ophthalmology Association (BOA) conference, not as vendors, but as students. We spoke to over 70 ophthalmologists, listening to their frustrations with technology. What we heard was a consistent story of burnout from manual data entry and a deep-seated reluctance to learn yet another piece of software. This is where the patient app evolved into the complete AyuSight AI ecosystem (😂 just a funny name we generated).
What it does
AyuSight AI is a complete ecosystem designed to create a frictionless connection between patients and doctors, built around three core components:
- The Ayu App (For Patients): A mobile-first platform where users can upload and manage all their medical documents, creating a single, comprehensive health history they can easily share.
- The Ayu Doctor Dashboard (The Command Center): This is the doctor's single pane of glass. It's a clean, intuitive web dashboard where the doctor can log in and see their patient list. When a patient scans the doctor's unique QR code or is searched by phone number, their entire consolidated medical history from the Ayu App instantly appears.
- The AyuSight AI Agent (The Silent Worker): This is our core innovation. The agent is a lightweight application that runs invisibly in the background on the doctor's computer. The doctor never interacts with it directly. Its only job is to listen for commands from the dashboard and perform actions on the doctor's existing Electronic Health Record (EHR) software. The workflow is designed to be seamless. During a consultation, the entire audio is recorded for accurate record-keeping. The doctor views the patient's complete history on the Ayu Dashboard. If they need to update the record with new data—say, from a physical document—they can simply use their voice: "Ayu, these OCT readings are here. Please add them to my EHR." The dashboard processes this command, and the background agent instantly performs the clicks and typing in the old EHR. This triad of app, dashboard, and agent finally delivers on our mission: to make doctors type no more.
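The voice-driven flow above can be sketched in code. This is a minimal illustration of how a transcribed command might be turned into a structured action for the background agent; the names (`parse_command`, `AgentAction`, the field names in the payload) are our own illustrative assumptions, not the actual AyuSight API.

```python
# Hypothetical sketch: route a transcribed dashboard voice command to the
# background agent as a structured action. All names here are illustrative.
from dataclasses import dataclass


@dataclass
class AgentAction:
    verb: str      # what the agent should do in the EHR, e.g. "add_readings"
    target: str    # which EHR section to update, e.g. "oct"
    payload: dict  # the clinical data extracted from the document/transcript


def parse_command(transcript: str, extracted: dict) -> AgentAction:
    """Map a transcribed voice command to an action the agent can execute."""
    text = transcript.lower()
    if "add" in text and "oct" in text:
        return AgentAction(verb="add_readings", target="oct", payload=extracted)
    raise ValueError(f"Unrecognized command: {transcript!r}")


action = parse_command(
    "Ayu, these OCT readings are here. Please add them to my EHR.",
    {"rnfl_avg_um": 92},
)
print(action.verb, action.target)  # → add_readings oct
```

In the real system this mapping would be done by an LLM rather than keyword rules, but the shape of the hand-off (command in, structured action out, agent executes) is the same.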
How we built it
Our journey started with a simple dashboard, but after realizing doctors wouldn't abandon their existing EHRs, we pivoted to an AI agent that works with their setup. We began experimenting with multimodal LLMs like Google's Gemini and Anthropic's Computer Use model. They could "see" the screen and perform tasks, but a single action took up to 120 seconds—far too slow. This led to our breakthrough: a hybrid deterministic-AI workflow.
Initial Computer Use from Claude wasn't great when we tried the Streamlit UI and demo. But its recent score on the OSWorld computer-use benchmark has jumped from 44.4 to 61.4. So we thought now is the time!
- Templatizing with Playwright: We built a Chrome extension to record a workflow (e.g., creating a patient) and turn it into a deterministic, high-speed Playwright script.
- AI as a Fallback: The AI agent first runs this optimized template. If the EHR's UI changes or an error occurs, the system automatically falls back to the Sonnet vision-based LLM, which analyzes the screen and intelligently completes the task.
- This powerful agent runs in the background, orchestrated by the Ayu Doctor Dashboard. The dashboard serves as the central command center, providing the clean web interface from which the doctor can view data and issue the voice commands that trigger the agent. Throughout this process, Claude Code helped a lot in building this in a weekend ❤️
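The template-first, AI-fallback pattern described above can be sketched as follows. This is a simplified illustration, not our actual implementation: `run_template` stands in for a recorded Playwright script and `run_vision_agent` for a Claude computer-use call, and both are injected as plain callables here so the control flow is easy to see.

```python
# Illustrative sketch of the hybrid deterministic-AI workflow: try the fast
# recorded template first, fall back to the vision-based agent on failure.
def run_hybrid(task: dict, run_template, run_vision_agent):
    """Execute a task with the deterministic template; if it fails
    (e.g. the EHR's UI changed), hand the task to the vision agent."""
    try:
        return ("template", run_template(task))
    except Exception:
        # UI drift, missing selector, timeout... -> let the LLM drive.
        return ("vision_fallback", run_vision_agent(task))


# Simulated runners: the template only knows the "v1" UI layout.
def fast_template(task):
    if task.get("ui_version") != "v1":
        raise RuntimeError("selector #patient-name not found")
    return "created patient via template"


def vision_agent(task):
    return "created patient via screen-reading agent"


print(run_hybrid({"ui_version": "v1"}, fast_template, vision_agent))
print(run_hybrid({"ui_version": "v2"}, fast_template, vision_agent))
```

The design payoff is that the common case runs at deterministic-script speed, and the expensive multimodal model is only invoked when the template breaks.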
Challenges we ran into
- The Wall of Workflow: Doctors are fiercely loyal to their EHRs. We learned we couldn't replace their software; we had to augment it invisibly, which led to the dashboard-agent architecture.
- The AI Speed Bump: Our initial pure-AI agent was smart but impractically slow. The 120-second wait time forced us to innovate the hybrid model, cutting execution time by more than half and making the system viable for a real clinic.
- The Fragmented Data Puzzle: Patients don't always have their full history. By connecting our dashboard to the doctor's EHR via the agent, we can merge the patient-provided data with the doctor's own records, creating the most complete picture possible.
Accomplishments that we're proud of
- The 2x Speed Breakthrough: Our hybrid deterministic-AI model slashed task execution time from 120 seconds to under 60 seconds. This optimization is the key to real-world clinical adoption.
- From Code to Clinic: We're incredibly proud of our on-the-ground research. In total, we talked to over 80 doctors and spent a full day at Jyoti Eye Care hospital in Gujarat to truly understand their day-to-day reality.
- A Complete, Functional Ecosystem: We built and connected all three parts of our vision: the live patient app, the intuitive doctor dashboard, and the intelligent background agent.
What we learned
- We Learned Ophthalmology: We're developers who learned to speak the language of medicine 😂😭. We understand diseases, can decipher prescriptions, and appreciate the nuances of a clinical workflow.
- The Art of Doctor-Developer Communication: We learned how to approach busy clinicians, respect their expertise, and ask the right questions to uncover their deepest pain points.
- The Power of Hybrid AI: We learned that the most elegant solution often isn't pure AI. It's about blending the raw intelligence of LLMs with the speed of deterministic automation, using the right tool for the right job.
What's next for AyuSight AI | AI EHR | Making doctors type no more
We're just getting started. Our immediate plan is to deploy the complete AyuSight AI ecosystem in a pilot program with 10-15 ophthalmologists to gather real-world feedback. Looking ahead, we have two key priorities:
- Leveraging Audio Speech to Text: With consultation audio being recorded, we plan to explore AI-powered transcription and summarization to automatically generate clinical notes, saving doctors even more time.
- Security and Privacy: We are aware of the sensitivity of patient data & PII. A major focus will be on building a secure, ABHA & HIPAA-compliant architecture to ensure our entire system is robust and trustworthy.
Thanks for reading, guys ;) Sorry it was a bit long!
Built With
- claude
- javascript
- next
- python
- react-native