Inspiration
As a fresher, I noticed that many students and early-career professionals struggle with basic but important decisions such as which programming language to learn, which career path to choose, or how to plan the next few months.
Most online advice is scattered, opinionated, or generic. Instead of clarity, it often increases confusion. I wanted to build something simple that helps people think clearly and make reasoned decisions, not just get answers.
What the project does
AI Decision Explainer is a lightweight web application that converts vague, real-world questions into clear, structured decision guidance.
A user enters a problem in plain English.
The app responds with:
- Key factors involved in the decision
- A comparison of available options with pros and cons
- A clear recommendation
- Reasoning behind that recommendation
The focus is on explainability, not just output.
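The structured guidance described above can be modeled as a small data structure. This is a minimal sketch; the `DecisionGuidance` and `OptionAnalysis` names and fields are illustrative, not the app's actual schema:

```python
from dataclasses import dataclass

@dataclass
class OptionAnalysis:
    name: str
    pros: list[str]
    cons: list[str]

@dataclass
class DecisionGuidance:
    key_factors: list[str]      # variables that matter for the decision
    options: list[OptionAnalysis]  # each option with pros and cons
    recommendation: str         # a single clear choice
    reasoning: str              # why that choice follows from the factors

# Example: guidance for "Which programming language should I learn first?"
guidance = DecisionGuidance(
    key_factors=["learning curve", "job market", "project goals"],
    options=[
        OptionAnalysis("Python", pros=["beginner-friendly"], cons=["slower runtime"]),
        OptionAnalysis("JavaScript", pros=["runs everywhere"], cons=["quirky semantics"]),
    ],
    recommendation="Python",
    reasoning="Lowest learning curve for a beginner with data-oriented goals.",
)
```

Keeping the four sections explicit in a structure like this is what makes the output explainable rather than a free-form answer.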
How I built it
The application is built using:
- Python for core logic
- Streamlit for the user interface
- Google Gemini API for reasoning and content generation
I designed a structured prompt that forces the model to:
- Identify decision factors
- Analyze trade-offs between options
- Provide a justified recommendation
This ensures consistent, explainable outputs instead of free-form answers.
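A structured prompt along these lines can be sketched as follows. The exact wording, the model id, and the `explain_decision` helper are assumptions for illustration; the call uses the `google-genai` SDK's `client.models.generate_content`:

```python
def build_prompt(question: str) -> str:
    """Wrap the user's question in a structure that forces factor analysis."""
    return (
        "You are a decision-support assistant. For the question below, respond "
        "with exactly these sections:\n"
        "1. Key Factors - the variables that matter for this decision\n"
        "2. Options - each available option with pros and cons\n"
        "3. Recommendation - a single clear choice\n"
        "4. Reasoning - why that choice follows from the factors\n\n"
        f"Question: {question}"
    )

def explain_decision(question: str, model: str = "gemini-2.0-flash") -> str:
    """Query Gemini with the structured prompt (requires an API key).

    Model id is an assumption; availability and quotas change over time.
    """
    from google import genai  # pip install google-genai
    client = genai.Client()   # reads GOOGLE_API_KEY from the environment
    response = client.models.generate_content(
        model=model,
        contents=build_prompt(question),
    )
    return response.text
```

Because every response is forced through the same four sections, downstream display logic in Streamlit can rely on a predictable shape.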
What I learned
Through this project, I learned:
- How to integrate the Gemini API using the new Google GenAI SDK
- How model selection, rate limits, and retries affect real-world apps
- How prompt structure directly influences output quality
- How to design AI systems focused on decision support rather than conversation
I also gained experience handling API errors and model overloads, and building a stable fallback flow.
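A fallback flow of this kind can be sketched as retry-with-backoff plus model fallback. This is a simplified illustration, not the app's exact code; `call_model` stands in for whatever function actually queries one Gemini model:

```python
import time

def generate_with_fallback(call_model, models, max_retries=3, base_delay=1.0):
    """Try each model in order; retry transient failures with exponential backoff.

    `call_model(name)` is any function that queries one model and raises on
    errors such as rate limits (HTTP 429) or overloads (HTTP 503).
    """
    last_error = None
    for name in models:
        for attempt in range(max_retries):
            try:
                return call_model(name)
            except Exception as err:  # the real app should catch specific API errors
                last_error = err
                time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise RuntimeError(f"All models failed: {last_error}")
```

With a primary and a backup model in the list, a sustained overload on the first model degrades to the second instead of surfacing an error to the user.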
Challenges faced
The main challenges were:
- Navigating changing Gemini model availability and quotas
- Handling rate limits gracefully with retries and a fallback flow
Built With
- google-gemini-api
- googlegenaisdk
- python
- streamlit