Inspiration

Growing up in a small town in Himachal Pradesh, I saw how limited healthcare access can be. Doctors only visited once or twice a week, and a close friend’s fracture went undiagnosed for nearly two weeks because no doctor was available. This experience inspired me to build X-AI Care — not as a replacement for doctors, but as an assistive, explainable AI prototype that can highlight concerns, provide transparent insights, and improve communication between doctors and patients.


What it does

X-AI Care is an open, explainable diagnostic support system. Users upload medical images (such as X-rays); the system runs them through ML models and generates:

  • Visual explainability (Grad-CAM heatmaps; a minimal sketch follows this list).
  • Feature attribution (SHAP values).
  • Counterfactuals (what-if scenarios).
  • Confidence scoring for predictions.
  • Doctor-friendly reports with technical insights.
  • Patient-friendly summaries written in plain, easy-to-understand language.
  • Interactive chatbot powered by GPT OSS for case discussion.
  • Dashboard + Database to manage cases, store reports, and track analytics.
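
To make the Grad-CAM bullet concrete, here is a minimal sketch of how heatmaps like ours can be computed in PyTorch. The ResNet-18 backbone and `layer4` target layer are illustrative stand-ins, not the prototype's exact fracture/cataract architectures.

```python
# Minimal Grad-CAM sketch (PyTorch). Model and target layer are
# illustrative; the prototype's architectures may differ.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=None)  # stand-in for a prototype model
model.eval()

activations, gradients = {}, {}

def save_activation(module, inp, out):
    activations["feat"] = out

def save_gradient(module, grad_in, grad_out):
    gradients["feat"] = grad_out[0]

# Hook the last convolutional block (layer4 in ResNet-18).
model.layer4.register_forward_hook(save_activation)
model.layer4.register_full_backward_hook(save_gradient)

def grad_cam(image: torch.Tensor) -> torch.Tensor:
    """Return an [H, W] heatmap in [0, 1] for the top predicted class."""
    logits = model(image)                    # image: [1, 3, H, W]
    model.zero_grad()
    logits[0, logits.argmax()].backward()    # gradient of the top-class score
    acts, grads = activations["feat"], gradients["feat"]     # [1, C, h, w]
    weights = grads.mean(dim=(2, 3), keepdim=True)           # channel importance
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))  # [1, 1, h, w]
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear",
                        align_corners=False)[0, 0]
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)

heatmap = grad_cam(torch.randn(1, 3, 224, 224))  # overlaid on the X-ray in the UI
```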

How we built it

  • Trained prototype ML models for wrist fracture and cataract detection.
  • Integrated Grad-CAM, SHAP, and counterfactuals for explainability (a simple what-if probe is sketched after this list).
  • Used confidence scoring to show prediction reliability (see the scoring sketch below).
  • Implemented GPT OSS for automated report generation and chatbot interaction.
  • Built a database and dashboard for storing cases, visualizations, and reports.
  • Designed a dual report system: one for doctors (detailed, technical), one for patients (simplified, empathetic); a prompt sketch follows below.
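
The counterfactual layer is the simplest of the three: ask what happens to the prediction if the region the model attends to were removed. A hedged sketch of that occlusion-style what-if probe, reusing the Grad-CAM heatmap from above (the function name and quantile cutoff are ours for illustration, not the prototype's exact method):

```python
# Hedged "what-if" probe: occlude the hottest heatmap region and see how the
# prediction changes. A simple occlusion counterfactual, not a full method.
import torch

def what_if_occlusion(model, image, heatmap, quantile=0.90):
    """Mask the most-attended pixels and compare predictions before/after."""
    mask = (heatmap >= heatmap.quantile(quantile)).float()  # [H, W] hot region
    occluded = image * (1.0 - mask)                         # broadcast over channels
    with torch.no_grad():
        before = model(image).softmax(dim=-1)[0]
        after = model(occluded).softmax(dim=-1)[0]
    return {
        "class_before": int(before.argmax()),
        "class_after": int(after.argmax()),
        "confidence_drop": float(before.max() - after.max()),
    }

# Example: report = what_if_occlusion(model, xray_tensor, heatmap)
```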
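Confidence scoring reduces to turning raw logits into probabilities and flagging low-confidence cases for human review. A minimal sketch; the 0.70 cutoff is an assumed placeholder, tuned per model on validation data in practice:

```python
# Minimal confidence-scoring sketch. The 0.70 cutoff is an assumed
# placeholder, not the prototype's tuned threshold.
import torch
import torch.nn.functional as F

LOW_CONFIDENCE_THRESHOLD = 0.70  # hypothetical cutoff

def score_prediction(logits: torch.Tensor) -> dict:
    """Turn raw logits into a prediction plus a reliability flag."""
    probs = F.softmax(logits, dim=-1)
    confidence, class_idx = probs.max(dim=-1)
    return {
        "class_index": int(class_idx),
        "confidence": round(float(confidence), 3),
        "needs_review": float(confidence) < LOW_CONFIDENCE_THRESHOLD,
    }

print(score_prediction(torch.tensor([1.2, 3.4])))
# {'class_index': 1, 'confidence': 0.9, 'needs_review': False}
```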
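For the dual report system, we prompt GPT OSS twice with the same finding and different audience instructions. The sketch below assumes gpt-oss served behind an OpenAI-compatible endpoint; the base URL, model name, and prompt wording are illustrative, not our exact configuration.

```python
# Dual-report sketch. Base URL, model name, and prompts are illustrative
# assumptions; gpt-oss is assumed served behind an OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

SYSTEM = ("You are an assistive tool, not a diagnostic authority. "
          "Always recommend consulting a clinician.")

STYLES = {
    "doctor": "a concise technical report with key findings and caveats",
    "patient": "a short, empathetic plain-language summary with no jargon",
}

def generate_reports(finding: str, confidence: float) -> dict:
    """One GPT OSS call per audience: same finding, different framing."""
    reports = {}
    for audience, style in STYLES.items():
        response = client.chat.completions.create(
            model="gpt-oss-20b",  # assumed local model name
            messages=[
                {"role": "system", "content": SYSTEM},
                {"role": "user",
                 "content": f"Finding: {finding} "
                            f"(model confidence {confidence:.0%}). "
                            f"Write {style}."},
            ],
        )
        reports[audience] = response.choices[0].message.content
    return reports
```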

Challenges we ran into

  • Responsible framing: ensuring this is clearly an assistive tool and not a diagnostic substitute.
  • Integration complexity: combining ML models, explainability tools, GPT OSS, and dashboards into one cohesive system.
  • Balancing outputs: making technical details digestible while keeping patient summaries simple.
  • Resource limitations: approximating a scalable healthcare prototype within hackathon time and compute constraints.

Accomplishments that we're proud of

  • Created an end-to-end pipeline: image upload → explainability → GPT OSS reports → dashboard.
  • Combined Grad-CAM, SHAP, and counterfactuals for multi-layer explainability.
  • Built doctor + patient dual reports to bridge communication gaps.
  • Designed a dashboard + DB system that makes the project feel like a real prototype, not just a demo.
  • Translated a real-life personal problem into a tangible solution.

What we learned

  • How to integrate multiple explainability techniques into medical AI.
  • How GPT OSS can move beyond chat to power dynamic reporting and dialogue.
  • The value of human-centered AI design in sensitive domains like healthcare.
  • The importance of traceability and reproducibility in AI outputs through database storage (schema idea sketched below).
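
As a sketch of that last point: each case row pins the input hash and model version, so any stored report can be traced back to the exact inputs and weights that produced it. Table and column names here are illustrative, not the prototype's exact schema.

```python
# Traceability sketch: each case row pins the exact input and model version
# so a report can be re-derived later. Names are illustrative.
import sqlite3

conn = sqlite3.connect("xai_care.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS cases (
    case_id        INTEGER PRIMARY KEY AUTOINCREMENT,
    image_sha256   TEXT NOT NULL,      -- hash of the uploaded image
    model_name     TEXT NOT NULL,      -- e.g. 'wrist_fracture'
    model_version  TEXT NOT NULL,      -- pins the exact weights used
    prediction     TEXT NOT NULL,
    confidence     REAL NOT NULL,
    heatmap_path   TEXT,               -- stored Grad-CAM artifact
    doctor_report  TEXT,
    patient_report TEXT,
    created_at     TEXT DEFAULT CURRENT_TIMESTAMP
)
""")
conn.commit()
```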

What's next for X-AI Care - Open, explainable diagnostics

  • Expanding to more medical models beyond fractures and cataracts.
  • Adding bias and fairness checks to ensure reliability across diverse populations.
  • Improving the dashboard with advanced analytics (model performance trends, case clustering).
  • Exploring integration with telemedicine platforms as an assistive layer.
  • Refining reports further with multi-language support, so patients in rural areas can get insights in their local language.
