Inspiration
Millions of people in the U.S. face legal proceedings without adequate representation, leaving many with little chance to prove innocence or challenge bias. This project applies AI for social impact: expanding access to justice for the least privileged.
What it does
The system transforms raw parole hearing transcripts into citation-backed, attorney-ready insights. It surfaces explicit and implicit claims of innocence and bias, and attorneys can instantly query transcripts for key facts and contradictions, accelerating triage and early-stage case evaluation.
How we built it
A FastAPI backend orchestrates ingestion and background processing through Redis/RQ. Transcripts are parsed, structured, and indexed via a RAG pipeline combining LlamaIndex and OpenAI models. The frontend enables drag-and-drop uploads, real-time summaries, and chat-based review.
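The structuring step can be sketched in plain Python. This is a minimal illustration, not the project's actual parser: it assumes a conventional transcript layout where speaker turns start with an uppercase name followed by a colon, and it records start/end line numbers per turn so that downstream answers can cite exact transcript locations before the segments are handed to the RAG index.

```python
import re
from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str
    text: str
    start_line: int
    end_line: int

# Assumed transcript convention: turns open with "SPEAKER NAME:".
SPEAKER_RE = re.compile(r"^([A-Z][A-Z .'-]+):\s*(.*)$")

def parse_transcript(raw: str) -> list[Segment]:
    """Group transcript lines into speaker turns, keeping each
    turn's start/end line numbers for later citation."""
    segments: list[Segment] = []
    current: Segment | None = None
    for lineno, line in enumerate(raw.splitlines(), start=1):
        m = SPEAKER_RE.match(line.strip())
        if m:
            # New speaker turn begins; flush the previous one.
            if current:
                segments.append(current)
            current = Segment(m.group(1), m.group(2), lineno, lineno)
        elif current and line.strip():
            # Continuation line: extend the current turn.
            current.text += " " + line.strip()
            current.end_line = lineno
    if current:
        segments.append(current)
    return segments
```

Segments like these carry enough metadata (speaker, line span) to become indexable nodes while preserving the provenance the citation feature depends on.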
Challenges we ran into
Integrating the frontend and backend within limited time proved complex. Managing real-time processing and caching while maintaining accuracy required several iterations. Ensuring the citation links aligned perfectly with transcript line numbers was particularly demanding.
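One way to keep citations honest is to validate every line reference the model emits before showing it to a user. The sketch below assumes a hypothetical marker format like `[L42-L45]` in model answers (the format is an assumption, not the project's actual one) and rejects any reference that falls outside the stored transcript:

```python
import re

# Hypothetical citation marker the model is prompted to emit,
# e.g. "The petitioner denied involvement [L42-L45]."
CITATION_RE = re.compile(r"\[L(\d+)(?:-L(\d+))?\]")

def resolve_citations(answer: str, transcript_lines: list[str]) -> list[tuple[int, int, str]]:
    """Map each [Lxx-Lyy] marker in a model answer back to the exact
    transcript text, raising on out-of-range references so a broken
    citation is caught instead of silently shown to an attorney."""
    resolved = []
    for m in CITATION_RE.finditer(answer):
        start = int(m.group(1))
        end = int(m.group(2) or m.group(1))  # single-line citation has no end
        if not (1 <= start <= end <= len(transcript_lines)):
            raise ValueError(f"citation {m.group(0)} is out of range")
        excerpt = " ".join(transcript_lines[start - 1 : end])
        resolved.append((start, end, excerpt))
    return resolved
```

Validating at resolution time means a misaligned index shows up as an error during development rather than as a confidently wrong citation in the UI.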
Accomplishments that we're proud of
We created a fully functional innocence-detection and citation system within a single application, achieved seamless document ingestion and AI-backed fact extraction, and delivered a tool that legal teams can realistically use to accelerate case triage.
What we learned
Early testing is essential to balance performance and accuracy. Deep familiarity with parole hearing structure greatly improves model prompting. The process revealed how procedural language encodes subtle innocence cues and bias patterns.
What's next for The Innocence Engine
Expand to assist innocence projects nationwide with scalable, outbound client identification. Incorporate fairness and bias detectors tailored to each state’s parole system. Ultimately, build a global platform for AI-supported post-conviction review.