Inspiration
Alzheimer’s disease often develops silently for years before symptoms become clinically visible. Early intervention can significantly slow progression, but existing diagnostic tools are expensive, invasive, or opaque. This motivated me to build an interpretable AI system that assists early risk identification using accessible clinical data while remaining trustworthy for real-world clinical use.
What it does
This project predicts early Alzheimer’s disease risk and progression using machine learning models trained on clinical and cognitive assessment data. Unlike black-box approaches, the system uses SHAP-based explanations to clearly show how individual features contribute to predictions, enabling transparent and interpretable decision support.
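To give a feel for what "feature contributions" means here: for a linear model with roughly independent features, the SHAP value of feature i reduces to w_i * (x_i - mean_i), its signed push away from the average prediction. A minimal NumPy sketch with made-up weights and feature values (illustrative only, not the project's real clinical features):

```python
import numpy as np

# Hypothetical linear risk model: score = w . x + b
weights = np.array([0.8, -0.5, 0.3])     # e.g. age, cognitive score, education (invented)
baseline = np.array([70.0, 25.0, 12.0])  # mean feature values over the training data

def linear_shap(x, w, mean):
    """SHAP values for a linear model with independent features:
    each feature's contribution is its weight times its deviation
    from the training mean."""
    return w * (x - mean)

patient = np.array([78.0, 21.0, 12.0])
contributions = linear_shap(patient, weights, baseline)
# Older age and a lower cognitive score both push risk up;
# education at the mean contributes nothing:
# [0.8*(78-70), -0.5*(21-25), 0.3*(12-12)] = [6.4, 2.0, 0.0]
```

The contributions sum to the model's deviation from the average prediction, which is exactly the additivity property that makes SHAP plots readable for clinicians.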
How I built it
I developed a full machine learning pipeline in Python using Google Colab. The workflow includes data preprocessing, median imputation, robust feature scaling, model training with Logistic Regression and Random Forest, and performance evaluation using accuracy, ROC-AUC, and confusion matrices. SHAP explainability is integrated to interpret feature-level contributions.
Challenges I ran into
Clinical datasets contain missing values, outliers, and class imbalance. Ensuring numerical stability while maintaining interpretability required careful preprocessing and model selection. Handling infinite values and making the notebook fully reproducible were also key challenges.
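One concrete fix for the infinite-value problem (ratio features can produce +/-inf after division): map infinities to NaN so the median imputation handles them alongside ordinary missing values. A small pandas sketch with toy numbers:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"ratio": [1.2, np.inf, 0.8, -np.inf, np.nan]})

# Treat +/-inf as missing, then fill all gaps with the column median
df = df.replace([np.inf, -np.inf], np.nan)
df["ratio"] = df["ratio"].fillna(df["ratio"].median())
# median of the finite values [1.2, 0.8] is 1.0, so the
# column becomes [1.2, 1.0, 0.8, 1.0, 1.0]
```

For the class-imbalance side, scikit-learn's `class_weight="balanced"` option on `LogisticRegression` and `RandomForestClassifier` is one common mitigation.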
Accomplishments that I'm proud of
I built a reproducible, interpretable AI pipeline that balances predictive performance with clinical transparency. The integrated SHAP explanations make each prediction auditable at the feature level, a prerequisite for real-world healthcare decision support.
What I learned
This project strengthened my understanding of explainable AI, clinical data preprocessing, and the importance of interpretability in healthcare-focused machine learning systems.
What's next
Future improvements include validating the model on external datasets, adding longitudinal progression modeling, and integrating a clinician-friendly dashboard for real-time interpretation.
Built With
- google-colab
- numpy
- pandas
- python
- scikit-learn
- shap