Inspiration

The inspiration for PrediCore AI stems from the urgent need to address bias and inequities in the criminal justice system. Traditional tools, like the COMPAS algorithm, often perpetuate systemic biases, leading to unfair outcomes. We envisioned a platform that not only predicts recidivism but also integrates fairness measures to ensure ethical and equitable decision-making.
What it does

PrediCore AI predicts three critical aspects of recidivism:

- Likelihood of reoffense
- Severity of potential crimes
- Time until recidivism occurs

The platform also includes fairness evaluation metrics to identify and mitigate systemic biases, ensuring its predictions are not only accurate but equitable. This allows stakeholders to make more informed, fair, and transparent decisions.
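The three outputs above could each come from a separate estimator: a classifier for reoffense likelihood, a classifier for severity, and a regressor for time-to-recidivism. A minimal sketch with scikit-learn Random Forests (synthetic data and feature layout are illustrative assumptions, not the project's real pipeline, which also uses XGBoost):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 6))            # synthetic case features (illustrative only)
reoffend = (X[:, 0] > 0).astype(int)     # 1 = reoffended
severity = rng.integers(0, 3, size=300)  # 0 = minor, 1 = moderate, 2 = severe
months = np.abs(X[:, 1]) * 12.0          # months until recidivism

# One model per predicted aspect
likelihood_model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, reoffend)
severity_model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, severity)
time_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, months)

case = X[:1]
proba = likelihood_model.predict_proba(case)[0, 1]  # probability of reoffense
sev = severity_model.predict(case)[0]               # predicted severity class
eta = time_model.predict(case)[0]                   # predicted months until reoffense
```

In a deployed system the three predictions would be bundled into a single response for the interface; here they are simply separate model outputs.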
How we built it

PrediCore AI was developed using:

- Machine Learning Models: XGBoost and Random Forest for predictive accuracy, alongside adversarial neural networks to mitigate bias.
- Backend and Data Processing: Flask manages data flow between the models and the user interface.
- Frontend: a simple, accessible interface that displays prediction results clearly and lets users download reports and view data visualizations.
- Fairness Metrics: custom algorithms that evaluate fairness across demographic groups, targeting equalized outcomes.

Challenges we ran into

One major challenge was addressing bias in the dataset. Training adversarial models to maintain predictive accuracy while reducing bias required significant fine-tuning. Integrating fairness metrics into a user-friendly system also proved complex, as balancing technical depth with accessibility was key.
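To illustrate the kind of group-fairness evaluation described above, the sketch below computes two common gaps between demographic groups: a demographic-parity difference (positive-prediction rates) and a true-positive-rate gap (one component of equalized odds). The data and names are hypothetical; the project's actual metrics are custom.

```python
import numpy as np

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-prediction rates between the two groups."""
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

def true_positive_rate_gap(y_true, y_pred, group):
    """Absolute difference in recall (TPR) between the two groups."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    tprs = []
    for g in (0, 1):
        positives = (group == g) & (y_true == 1)
        tprs.append(y_pred[positives].mean())
    return abs(tprs[0] - tprs[1])

# Hypothetical example: 8 cases, binary predictions, binary protected attribute
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 1, 1]
group  = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_gap(y_pred, group))         # → 0.25
print(true_positive_rate_gap(y_true, y_pred, group))
```

A gap near zero on both metrics is the "equalized outcomes" target; in practice there is a well-known tension between satisfying several such criteria at once, which is part of why tuning was difficult.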
Accomplishments that we're proud of

We're proud of creating a platform that goes beyond prediction to integrate fairness and transparency into criminal justice decision-making. Successfully implementing adversarial debiasing alongside a fully functional interface was a significant technical and ethical achievement.
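The adversarial debiasing mentioned above can be sketched with TensorFlow/Keras (both in our stack). This is a minimal illustration on synthetic data with a hypothetical binary protected attribute, not the project's training code: an adversary tries to recover the protected attribute from the predictor's output, and the predictor is penalized whenever it succeeds.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 8)).astype("float32")
y = (X[:, 0] + rng.normal(scale=0.5, size=256) > 0).astype("float32")  # label
a = (X[:, 1] > 0).astype("float32")  # hypothetical protected attribute

predictor = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
adversary = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
bce = tf.keras.losses.BinaryCrossentropy()
p_opt = tf.keras.optimizers.Adam(1e-2)
a_opt = tf.keras.optimizers.Adam(1e-2)
lam = 1.0  # strength of the fairness penalty (tuning this was the hard part)

for step in range(30):
    # Predictor step: fit the label, but punish outputs that reveal `a`
    with tf.GradientTape() as tape:
        y_hat = predictor(X, training=True)
        adv_guess = adversary(y_hat, training=False)
        p_loss = bce(y, tf.squeeze(y_hat)) - lam * bce(a, tf.squeeze(adv_guess))
    grads = tape.gradient(p_loss, predictor.trainable_variables)
    p_opt.apply_gradients(zip(grads, predictor.trainable_variables))

    # Adversary step: get better at recovering `a` from the predictions
    with tf.GradientTape() as tape:
        adv_guess = adversary(predictor(X, training=False), training=True)
        a_loss = bce(a, tf.squeeze(adv_guess))
    grads = tape.gradient(a_loss, adversary.trainable_variables)
    a_opt.apply_gradients(zip(grads, adversary.trainable_variables))
```

The alternating updates are what make fine-tuning delicate: a `lam` that is too large erases predictive signal, while one that is too small leaves the bias in place.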
What we learned

Through this project, we deepened our understanding of:

- The ethical implications of AI in sensitive areas like criminal justice
- How to implement and evaluate fairness metrics in machine learning models
- The importance of designing solutions that are both technically robust and socially impactful

What's next for PrediCore AI

Looking ahead, we aim to:

- Expand the platform to predict other outcomes, such as successful rehabilitation pathways
- Refine our adversarial models for greater accuracy and scalability
- Collaborate with stakeholders in the criminal justice system to pilot PrediCore AI in real-world settings, ensuring it achieves its mission of fairness and equity
Built With
- css
- flask
- html
- javascript
- keras
- python
- sci
- tensorflow
