Inspiration

Patients in Tier 2 and Tier 3 cities who desperately need consultations with super-specialists (Neurologists, Cardiologists, Oncologists) face appointment wait times of 3-4 weeks, or are forced to travel hundreds of miles to metro healthcare hubs. We realized that raw conversational AI isn't enough to solve this. To cross the "Last Mile" of healthcare delivery, we needed an actionable, agentic workflow. We were inspired to build Duckteer: a system where AI doesn't just chat, but actively triages, prioritizes, and connects rural patients to world-class specialists using standardized Agent-to-Agent (A2A) collaboration.

What it does

Duckteer is an intelligent, A2A-driven telemedicine ecosystem. When a patient inputs their symptoms, our system triggers a collaborative agent workflow:

  • The Triage Superpower (MCP Tool): Powered by Google Gemini, it analyzes symptoms (e.g., "severe chest pain"), maps them to the exact required medical specialty (e.g., a Cardiologist), and assigns a clinical Urgency Score.
  • The Booking Agent (A2A): Receives the clinical context and urgency via SHARP context propagation, queries doctor availability, and secures an appointment slot.
  • The Consultation: Provides a native WebRTC video interface where the doctor reviews an AI-generated clinical summary and the patient's FHIR-compliant medical history in real-time.
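To make the triage output concrete, here is a minimal sketch of the symptom-to-specialty mapping. In the real system this classification is delegated to Google Gemini through the MCP tool; the keyword table, function name, and urgency values below are purely illustrative stand-ins that show the shape of the result the Booking Agent receives.

```javascript
// Illustrative triage sketch. The production system asks Gemini to perform
// this mapping; a keyword table stands in here so the output contract
// ({ specialty, urgencyScore }) is easy to see.
const SPECIALTY_RULES = [
  { pattern: /chest pain|palpitation/i, specialty: "Cardiologist", urgency: 9 },
  { pattern: /seizure|numbness|severe headache/i, specialty: "Neurologist", urgency: 8 },
  { pattern: /lump|unexplained weight loss/i, specialty: "Oncologist", urgency: 7 },
];

function triage(symptomText) {
  for (const rule of SPECIALTY_RULES) {
    if (rule.pattern.test(symptomText)) {
      return { specialty: rule.specialty, urgencyScore: rule.urgency };
    }
  }
  // No specialist match: route to a general physician at low urgency.
  return { specialty: "General Physician", urgencyScore: 3 };
}
```

For example, `triage("severe chest pain")` yields `{ specialty: "Cardiologist", urgencyScore: 9 }`, which is the context the Booking Agent uses to query the right doctor pool first.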

How we built it

To align with the Prompt Opinion platform and modern healthcare standards, we built Duckteer at the intersection of powerful UX and robust interoperability:

  • The Intelligence Layer (Option 2 - A2A & MCP): We developed an MCP server that acts as our "Symptom Extraction Superpower." This tool seamlessly feeds context into our A2A Orchestrator using SHARP headers, ensuring that when the Specialist Agent takes over, it already has the required FHIR data structures (Patient IDs, Observation logs) ready to go.
  • Backend: Node.js + Express with MongoDB, utilizing Socket.io to manage real-time WebRTC signaling for doctor-patient video consultations.
  • Frontend: A mobile-first Vite + React application with high-fidelity UI, focusing on accessibility and frictionless onboarding for users in non-metro regions.
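The signaling flow behind the video consultation can be sketched as a simple relay: each peer in a room forwards its SDP offers/answers and ICE candidates to the other peers. In production this logic lives inside Socket.io event handlers; the room object and method names below are hypothetical, written as plain functions so the message flow is testable in isolation.

```javascript
// Minimal WebRTC signaling relay sketch (names hypothetical).
// In the real backend these operations map onto Socket.io events
// (join / signal / message delivery); here messages are queued per peer.
function createSignalingRoom() {
  const peers = new Map(); // peerId -> queue of pending signaling messages

  return {
    // A doctor or patient joins the consultation room.
    join(peerId) {
      peers.set(peerId, []);
    },
    // Relay an SDP offer/answer or ICE candidate to every other peer.
    signal(fromId, payload) {
      for (const [peerId, queue] of peers) {
        if (peerId !== fromId) queue.push({ from: fromId, ...payload });
      }
    },
    // Deliver and clear a peer's pending messages.
    drain(peerId) {
      const queue = peers.get(peerId) || [];
      peers.set(peerId, []);
      return queue;
    },
  };
}
```

With Socket.io, `signal` would instead be `socket.to(roomId).emit("signal", payload)`, but the relay semantics are the same.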

Challenges we ran into

Shifting our architectural mindset from standard REST APIs to an Agent-to-Agent (A2A) communication model was our biggest hurdle. Ensuring that the MCP symptom-analysis tool could cleanly propagate patient context and FHIR tokens down the chain to the Booking Agent without losing clinical accuracy required careful schema design. Additionally, integrating a low-latency WebRTC video room directly alongside an AI-generated clinical dashboard pushed our full-stack capabilities to the limit.
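The schema-design work above amounted to agreeing on one context envelope that every agent in the chain can read. A hypothetical shape (field names and IDs are illustrative, not our actual wire format) looks like:

```json
{
  "patientId": "patient-1042",
  "triage": {
    "specialty": "Cardiologist",
    "urgencyScore": 9
  },
  "fhir": {
    "patientReference": "Patient/patient-1042",
    "observations": ["Observation/obs-001"]
  }
}
```

Because the Booking Agent only ever sees this envelope, the MCP symptom tool can evolve independently without breaking downstream agents.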

Accomplishments that we're proud of

We are incredibly proud of proving that AI can transcend the "chatbot phase." By utilizing MCP and A2A standards, we successfully built an ecosystem where AI genuinely acts as a medical coordinator—accurately prioritizing high-urgency chest pain over mild headaches and routing verified FHIR data securely into a doctor's live workspace. We bridged the gap between a rural patient's phone and a super-specialist's screen.

What we learned

We learned the immense power of the Model Context Protocol (MCP) and how leveraging standardized FHIR resources allows different AI agents to collaborate without requiring brittle, custom "glue code." We also gained deep insights into managing real-time WebSocket states and WebRTC signaling for mission-critical telemedicine.

What's next for Duckteer

Our immediate next step is expanding our MCP toolkit to include automated lab-report parsing (converting uploaded PDFs directly into FHIR Observation resources). We also plan to integrate a multilingual voice-to-text agent, allowing rural patients to simply speak their symptoms in their native dialects to initiate the entire A2A triage workflow.
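As a concrete target for the lab-report parser, each extracted result would become a standard FHIR Observation resource. A representative example (the patient reference and values are illustrative; the LOINC code 718-7 is the standard code for blood hemoglobin):

```json
{
  "resourceType": "Observation",
  "status": "final",
  "code": {
    "coding": [
      {
        "system": "http://loinc.org",
        "code": "718-7",
        "display": "Hemoglobin [Mass/volume] in Blood"
      }
    ]
  },
  "subject": { "reference": "Patient/patient-1042" },
  "valueQuantity": {
    "value": 13.2,
    "unit": "g/dL",
    "system": "http://unitsofmeasure.org",
    "code": "g/dL"
  }
}
```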
