Inspiration

Accessing healthcare quickly is still frustrating: long waits, unclear clinic availability, and no easy way to know where to go when symptoms appear. Our inspiration came from a simple question: what if healthcare guidance felt as immediate and intuitive as chatting with an assistant, while automatically guiding you to real nearby clinics?

Health AI was born from the idea of combining AI-driven symptom analysis with real-world location awareness, so users aren’t just told what to do, they’re shown where to go.

What it does

Health AI is an intelligent healthcare assistant that:

  • Lets users describe symptoms through a conversational AI interface
  • Uses location access to automatically find nearby walk-in clinics using Google Maps & Places
  • Displays clinics as interactive cards and pins on a live map, keeping the two in sync
  • Maintains conversation history and context across sessions
  • Uses Eleven Labs text-to-speech to place calls that book appointments on the user's behalf

The result is a seamless flow from symptom → recommendation → real-world action.

How we built it

Frontend: React + HTML + CSS + Tailwind

  • Custom UI for chat, clinic cards, and Google Maps integration
  • Real-time map panning, zooming, and marker highlighting
  • State-driven sync between clinic cards and map pins
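
The card ↔ pin sync above can be sketched as a single piece of shared state. This is a minimal illustration, not our actual component code; the names (`MapState`, `selectClinic`, etc.) are hypothetical:

```typescript
// One source of truth — the selected clinic id — drives both the
// highlighted card and the highlighted map pin, so they can never disagree.

interface Clinic {
  id: string;
  name: string;
  position: { lat: number; lng: number };
}

interface MapState {
  clinics: Clinic[];
  selectedId: string | null;
}

// Clicking a card (or its pin) updates the single source of truth.
function selectClinic(state: MapState, id: string | null): MapState {
  return { ...state, selectedId: id };
}

// Both views derive their highlight from the same state.
function isHighlighted(state: MapState, clinicId: string): boolean {
  return state.selectedId === clinicId;
}

// The map pans only when the selection actually changes, which is what
// prevents the re-center jitter mentioned under "Challenges".
function panTarget(prev: MapState, next: MapState): Clinic | null {
  if (prev.selectedId === next.selectedId) return null;
  return next.clinics.find((c) => c.id === next.selectedId) ?? null;
}
```

In React terms, `selectedId` lives in one state hook and both the card list and the marker layer read from it.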

Maps & Location:

  • Google Maps JavaScript API
  • Google Places API for nearby walk-in clinics
  • Geolocation API for user location consent
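
In the app the lookup goes through the Maps JavaScript API's `PlacesService` in the browser; as a rough sketch, the equivalent Places Nearby Search request looks like the URL built below (the `keyword` and radius values here are assumptions, and coordinates come from `navigator.geolocation` after the user consents):

```typescript
// Build a Places API Nearby Search request URL for walk-in clinics
// around the user's location.
function buildNearbySearchUrl(
  lat: number,
  lng: number,
  radiusMeters: number,
  apiKey: string,
): string {
  const params = new URLSearchParams({
    location: `${lat},${lng}`,       // from the browser Geolocation API
    radius: String(radiusMeters),    // search radius in meters
    keyword: "walk-in clinic",       // assumed search term
    key: apiKey,
  });
  return `https://maps.googleapis.com/maps/api/place/nearbysearch/json?${params}`;
}
```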

AI & Backend:

  • Microsoft Foundry to fine-tune models
  • Semantic Kernel (C#) so base models can perform agentic actions on the backend
  • Supabase/PostgreSQL + EF Core to store user data: conversations, messages, symptoms, and appointments
  • Twilio + Eleven Labs for natural-sounding calls that book appointments on the user's behalf
  • Terraform + JetBrains CI/CD to speed up development
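
The real entities live in C# with EF Core over Supabase/PostgreSQL; as a hedged sketch (all names assumed), the per-user data looks roughly like this, and restoring context across sessions is just replaying a conversation's messages in order:

```typescript
// Assumed shapes of the stored entities — the actual schema is defined
// in C#/EF Core, not here.
interface Conversation {
  id: string;
  userId: string;
  startedAt: string; // ISO timestamp
}

interface Message {
  id: string;
  conversationId: string;
  role: "user" | "assistant";
  text: string;
}

interface Appointment {
  id: string;
  userId: string;
  clinicName: string;
  scheduledFor: string; // ISO timestamp
}

// Rebuild a conversation's context for the model by replaying its
// messages in stored order.
function restoreContext(messages: Message[], conversationId: string): string[] {
  return messages
    .filter((m) => m.conversationId === conversationId)
    .map((m) => `${m.role}: ${m.text}`);
}
```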

UX Details:

  • Smooth animations, dark mode, and cursor-based effects
  • Clinic websites accessible directly from the UI
  • Clear visual hierarchy between chat, clinics, and map

Challenges we ran into

Synchronizing map behavior with UI state

  • Ensuring the map didn’t constantly re-center or jitter when clinics were selected required careful control of map bounds and effects.

Handling inconsistent API responses

  • Some clinics didn’t have websites, open hours, or complete data, so we had to design graceful fallbacks.
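
One way to sketch those fallbacks (field names here mirror the Places response, the defaults are our own): normalize every result once, before rendering, instead of branching on missing data all over the UI.

```typescript
// Raw Places results often omit website or hours.
interface RawPlace {
  name?: string;
  website?: string;
  opening_hours?: { open_now?: boolean };
}

// What a clinic card actually renders — every field is guaranteed.
interface ClinicCard {
  name: string;
  website: string | null; // null → hide the "Visit website" link
  openLabel: string;      // always something sensible to display
}

function toClinicCard(place: RawPlace): ClinicCard {
  return {
    name: place.name ?? "Unnamed clinic",
    website: place.website ?? null,
    openLabel:
      place.opening_hours?.open_now === undefined
        ? "Hours unavailable"
        : place.opening_hours.open_now
          ? "Open now"
          : "Closed",
  };
}
```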

Avoiding excessive Google Places calls

  • We optimized searches so the Places API only runs once per user location.

Maintaining smooth UX across async data flows

  • Chat responses, clinic discovery, and map updates all had to feel instant and connected.

Accomplishments that we're proud of

  • Fully integrated AI + Maps + real clinic data in a short hackathon timeframe
  • Seamless interaction between clinic cards and map markers
  • Clean handling of real-world edge cases (no website, closed clinics, missing data)
  • A polished, usable product, not just a demo
  • Most importantly, we built something that feels practical, intuitive, and immediately useful.

What we learned

  • Designing with real APIs introduces complexity that mock data never shows
  • Maps UX requires restraint: fewer automatic movements often feel better
  • Clear state management is critical when multiple UI components depend on the same data
  • Healthcare tools need clarity and trust more than flashy features

What's next for Health AI

  • Appointment booking integration directly from clinics
  • More advanced symptom triage and urgency scoring
  • Graph Feature to visualize symptoms and their potential causes
  • Clinic filtering (wait time, insurance, specialty)
  • Secure user profiles with long-term health memory
  • Mobile-first experience for on-the-go use
