Inspiration

Our inspiration was real doctors - the protocol they follow of first observing the patient visually, then asking about symptoms, and finally narrowing down the diagnosis through thoughtful questioning. We have tried to implement the same workflow in a website. Since there are always fewer doctors than patients, this could help close the supply-demand gap.

What it does

Our system mirrors this approach: the image-based model acts as the doctor’s eyes, making an initial assessment, while the chatbot continues the conversation, asking questions and analyzing symptoms to refine the diagnosis.

How we built it

We used a pre-trained vision transformer for skin disease classification, integrated it with a Streamlit frontend, and connected a symptom-checking chatbot using OpenAI’s API for a conversational interface.
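The image side of the pipeline can be sketched roughly as follows. The helper below uses Hugging Face's `image-classification` pipeline inside a Streamlit page; the checkpoint name `some/skin-disease-vit` is a placeholder, not our actual model, and the UI details are illustrative.

```python
def top_prediction(scores):
    """Return the (label, confidence) pair with the highest score.

    `scores` is a list of {"label": ..., "score": ...} dicts, the format
    produced by Hugging Face's image-classification pipeline.
    """
    best = max(scores, key=lambda s: s["score"])
    return best["label"], best["score"]


def run_app():
    # Heavy imports are kept inside the function so top_prediction()
    # can be used or tested without Streamlit/Transformers installed.
    import streamlit as st
    from PIL import Image
    from transformers import pipeline

    st.title("Skin Disease Detector")
    # Placeholder checkpoint -- substitute a ViT fine-tuned on skin-lesion data.
    classifier = pipeline("image-classification", model="some/skin-disease-vit")

    uploaded = st.file_uploader("Upload a skin image", type=["jpg", "jpeg", "png"])
    if uploaded is not None:
        preds = classifier(Image.open(uploaded))
        label, score = top_prediction(preds)
        st.write(f"Initial assessment: {label} ({score:.0%} confidence)")
```

In the real app, `run_app()` (or its equivalent top-level code) is executed with `streamlit run app.py`; the chatbot then picks up the predicted label as context for its follow-up questions.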

Challenges we ran into

We faced a few challenges integrating the multiple components smoothly and managing API keys securely without exposing them in the codebase.
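For the API-key problem, a common pattern (a minimal sketch, not necessarily our exact code; Streamlit's `st.secrets` serves the same purpose) is to read the key from the environment at startup and fail loudly if it is missing:

```python
import os


def load_api_key(var="OPENAI_API_KEY"):
    """Read an API key from an environment variable instead of the codebase.

    Raises immediately if the variable is unset, so a misconfigured deploy
    fails at startup rather than mid-conversation.
    """
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set the {var} environment variable before starting the app.")
    return key
```

The key then never appears in version control; only the variable name does.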

Accomplishments that we're proud of

While we could not implement all the features we had planned, we successfully designed and built several key components within a short timeframe. As newcomers to hackathons, the experience itself is a significant achievement for us.

What we learned

Through this project, we learned how to use pre-trained transformer-based vision models for targeted use cases like skin disease detection, how to build an interactive healthcare chatbot on top of large language models with streamed responses for more natural interaction, and how to integrate both in Streamlit.
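The streamed-response part can be sketched with OpenAI's Python SDK as below. The model name `gpt-4o-mini` and the system prompt are illustrative stand-ins, not our exact choices; the generator yields text deltas as they arrive so the UI can render the reply incrementally.

```python
def collect_stream(chunks):
    """Join the text pieces of a streamed reply into one string,
    skipping the None deltas the API emits around role/stop events."""
    return "".join(c for c in chunks if c)


def stream_chat(user_message, history=None):
    # SDK import kept inside the function so collect_stream() stays
    # usable without the openai package installed.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    messages = (
        # Illustrative system prompt, not our exact wording.
        [{"role": "system", "content": "You are a careful symptom-checking assistant."}]
        + (history or [])
        + [{"role": "user", "content": user_message}]
    )
    stream = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
        stream=True,
    )
    # Each chunk carries a small delta of the reply text.
    for chunk in stream:
        yield chunk.choices[0].delta.content
```

In Streamlit, a generator like this can be handed straight to `st.write_stream`, which is what makes the conversation feel live rather than arriving in one block.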

What's next for Skin Disease Detector & Health Chatbot

While the current version of our app demonstrates the potential of AI in healthcare, we aim to expand it further with:

  1. Real-time Consultation Matching
    Connect users with telemedicine platforms for verified doctor consultations.
  2. Multilingual Chatbot
    Support regional languages for broader accessibility.

Built With

  * openai
  * streamlit