Inspiration

We’ve all been there: sitting in a waiting room, filling out endless forms, and getting blindsided by a big bill just to hear, “It’s the flu.” We thought, why does healthcare have to be this complicated?

That’s why we built Lotus, an intelligent triage AI that makes healthcare feel human again. No paperwork. No waiting rooms. Just a simple, natural conversation about how you’re feeling. Lotus listens, asks the right questions, and delivers a clean, complete clinical summary straight to your provider. You shouldn’t have to repeat your story five times to get care.

Lotus empowers doctors. Clinics get more accurate data, faster visits, and lower costs. Patients get clarity, shorter waits, and an experience that just works.

We’re building the world’s first universally accessible, always-on digital intake and triage assistant, one that treats the patient’s experience with the same care as the treatment itself.

Healthcare is complex. Getting care shouldn’t be. Lotus makes it simple.

What it does

Lotus AI is your virtual nurse: attentive, fast, and always ready. It listens to your voice, interprets your video feed, and understands your symptoms in real time. Lotus AI can also access your location, search the web, and even open a booking dialog to schedule an appointment. Then it helps you take the next best step.

Lotus remembers your symptoms, finds nearby hospitals, performs real-time web searches, schedules appointments, and even sends follow-up emails, all through a natural conversation, not a form.

How we built it

We combined NVIDIA NIM models for advanced reasoning and vision, ElevenLabs for natural speech transcription, and Selenium-driven web automation. Our FastAPI backend orchestrates the entire workflow, managing phases, context, and intelligent triggers for reasoning, web searches, location tracking, appointments, and follow-ups.
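To give a feel for that orchestration, here is a minimal sketch of a single conversational turn. The /turn endpoint, session store, and the detect_triggers and run_reasoning helpers are illustrative placeholders, not our production code; the real system plugs the NIM reasoning call and the Selenium-driven actions in behind them.

```python
# Minimal sketch of one orchestration turn (assumed endpoint and helpers).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
sessions: dict[str, list[dict]] = {}  # conversation history per session


class TurnRequest(BaseModel):
    session_id: str
    transcript: str                    # text produced by speech transcription
    frame_summary: str | None = None   # optional description of the video feed


def detect_triggers(message: str) -> list[str]:
    """Crude keyword triggers standing in for the real trigger logic."""
    triggers = []
    lowered = message.lower()
    if "hospital" in lowered or "near me" in lowered:
        triggers.append("hospital_search")
    if "appointment" in lowered:
        triggers.append("booking")
    return triggers


async def run_reasoning(history: list[dict], frame_summary: str | None) -> str:
    """Placeholder for the reasoning-model call; returns a canned reply here."""
    return "Thanks, I've noted that. When did the symptoms start?"


@app.post("/turn")
async def handle_turn(req: TurnRequest) -> dict:
    history = sessions.setdefault(req.session_id, [])
    history.append({"role": "user", "content": req.transcript})

    triggers = detect_triggers(req.transcript)
    reply = await run_reasoning(history, req.frame_summary)
    history.append({"role": "assistant", "content": reply})

    return {"reply": reply, "triggers": triggers}
```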

Lotus continuously reasons through every user message, maintaining context while keeping answers short, clear, and voice-friendly.
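Much of that voice-friendliness comes from prompt constraints rather than post-processing. A system prompt along these lines captures the idea (illustrative wording only, not our exact prompt):

```python
# Illustrative system prompt; the real prompt also covers the tag-based
# reasoning format and the current phase of the conversation.
SYSTEM_PROMPT = (
    "You are Lotus, a calm and empathetic virtual nurse. "
    "Keep every reply under three short sentences so it reads well aloud. "
    "Ask one question at a time. Never give a diagnosis; "
    "summarize symptoms and suggest the next best step instead."
)
```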

Challenges we faced

  1. Designing a workflow that could track symptoms, timing, and context with precision
  2. Teaching the AI to “think” in a natural, human-like way
  3. Seamlessly processing multimodal input, both voice and video
  4. Triggering smart web searches and hospital lookups at the perfect time
  5. Managing conversation flow so responses stayed concise and natural
  6. Ensuring booking and email actions remained optional and user-controlled (see the sketch after this list)
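For the last point, one simple pattern is to gate every side effect behind an explicit confirmation step: the assistant proposes the action, and nothing runs until the user says yes. The state store and function names below are hypothetical, just to show the shape of it.

```python
# Sketch of gating side effects (booking, follow-up email) on explicit consent.
PENDING_ACTIONS: dict[str, dict] = {}  # session_id -> proposed action


def propose_action(session_id: str, action: dict) -> str:
    """Store the action and ask the user instead of executing it."""
    PENDING_ACTIONS[session_id] = action
    return f"I can {action['description']} for you. Should I go ahead?"


def confirm_action(session_id: str, user_said_yes: bool) -> str:
    """Run the pending action only if the user explicitly agreed."""
    action = PENDING_ACTIONS.pop(session_id, None)
    if action is None:
        return "There's nothing waiting for your confirmation."
    if not user_said_yes:
        return "No problem, I won't do that."
    # execute(action) would trigger the booking or email routine here.
    return "Done, I've taken care of it."


# Example: propose_action("abc", {"description": "book an appointment at the
# nearest clinic"}) returns a question; the booking only happens after
# confirm_action("abc", True).
```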

Accomplishments we’re proud of

  1. Built a fully functional multimodal nurse assistant from scratch, with live, real-time feedback
  2. Created a clear, tag-based reasoning framework for organized thought (see the sketch after this list)
  3. Integrated real-time web and hospital search
  4. Delivered consistent, empathetic, and voice-optimized responses
  5. Implemented a complete workflow including optional follow-up emails
  6. Designed a clean, intuitive UI that patients actually enjoy using
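To illustrate the tag-based idea: the model wraps its private reasoning and its spoken reply in distinct tags, and the backend forwards only the reply to the patient. The tag names below are illustrative, not necessarily the ones Lotus uses.

```python
import re

# The model is prompted to emit sections like <think>...</think> and
# <reply>...</reply>; only the reply section ever reaches the patient.
def extract_tag(text: str, tag: str) -> str:
    match = re.search(rf"<{tag}>(.*?)</{tag}>", text, flags=re.DOTALL)
    return match.group(1).strip() if match else ""


raw = (
    "<think>Cough for 3 days, mild fever, no chest pain. Ask about breathing.</think>"
    "<reply>Got it. Have you had any trouble breathing?</reply>"
)
print(extract_tag(raw, "reply"))  # -> Got it. Have you had any trouble breathing?
```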

What we learned

We discovered how powerful structured reasoning becomes when fused with multimodal AI. We learned how to manage complex contexts, engineer prompts that feel natural, and balance intelligence with empathy. Most of all, we learned what it takes to make an AI assistant that feels genuinely safe, human, and helpful.

Built With

NVIDIA NIM, ElevenLabs, Selenium, FastAPI
