Inspiration

Cardiovascular diseases (CVDs) are the number one cause of death globally. While doctors do their best, predicting survival in heart failure patients involves analyzing multiple complex clinical features simultaneously. We realized that for AI to be truly useful in medicine, it shouldn't just be a "black box" that spits out a prediction. Doctors need to know why the AI made that decision. That inspired us to build an Explainable AI (XAI) web application that not only predicts mortality risk but visually explains its reasoning.
What it does

Our project is a fully interactive, live web application designed for medical professionals.
- Interactive Inputs: Users can enter 11 clinical features (like Age, Ejection Fraction, Serum Creatinine, and Creatinine Phosphokinase) using intuitive sliders and dropdowns.
- Instant Risk Prediction: The app instantly calculates the probability of heart failure mortality and categorizes it as High Risk or Low Risk.
- AI Explanation (The Wow Factor): Instead of just giving a number, the app generates a dynamic SHAP waterfall plot. This graph shows the clinician exactly which patient factors increased the risk (red bars) and which factors decreased it (blue bars).
How we built it

We built this project using a robust data science pipeline:
- Data & Modeling: We used pandas for data manipulation and trained a GradientBoostingClassifier via scikit-learn. We chose gradient boosting for its high accuracy on tabular data.
- Explainable AI: We integrated the shap library to compute Shapley values, providing deep transparency into the model's decision-making process.
- Frontend & Deployment: We transitioned our code from a static Google Colab notebook into a dynamic web interface using Streamlit. Finally, we linked our code via GitHub and deployed it on Streamlit Community Cloud to make it accessible to everyone.
Challenges we ran into

Our biggest challenge was moving from a static Jupyter Notebook environment (Google Colab) to a live, interactive frontend. Initially, we tried exposing the locally running app through localtunnel (loca.lt), but we hit severe JavaScript/CSS dynamic import errors that broke the UI. We quickly pivoted: we learned how to set up a requirements.txt file, pushed our code to GitHub, and successfully deployed a stable version directly on Streamlit Community Cloud.
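That pivot hinged on a requirements.txt that Streamlit Community Cloud reads to build the environment. A minimal sketch is below; the package list mirrors the stack named in this writeup, and leaving versions unpinned (or pinning them) is a deployment choice, not something dictated by the platform.

```text
# requirements.txt (minimal sketch; exact version pins are assumptions)
streamlit
scikit-learn
pandas
shap
matplotlib
```

With this file in the GitHub repo root, Streamlit Community Cloud installs the dependencies automatically on each deploy.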
Accomplishments that we're proud of

We are incredibly proud of bridging the gap between a data science experiment and a real-world clinical tool. Building the model was one thing, but successfully embedding a complex SHAP waterfall plot into a live, user-friendly web app was a huge win for us. We turned raw code into a tangible, live software product!
What we learned

We learned that model accuracy isn't everything: interpretability matters just as much, especially in healthcare. We also leveled up our deployment skills, learning how to manage dependencies and deploy cloud applications seamlessly using GitHub and Streamlit.
What's next for CardioGuard: Explainable AI Predictor

In the future, we aim to:
- Integrate larger, multi-hospital datasets to make the model even more robust.
- Add a feature to generate downloadable PDF reports for patient records.
- Include more diverse clinical features (like ECG data) to improve predictive power.
Built With
- google-colab
- matplotlib
- pandas
- python
- scikit-learn
- shap
- streamlit