Inspiration

Because college students are broke, tired, and one overpriced burrito away from financial ruin—BudgetBites is here to keep you fed without sacrificing your rent money.

What it does

BudgetBites helps busy college students save money and time by turning their favorite meals into smart, budget-friendly grocery lists. Just tell the app what you love to eat, and it tailors weekly plans to your schedule, ingredients, and budget—so you can cook fast, eat well, and spend less.

How we built it

Input Parsing: We started by using the python-docx library to parse meal recipes and ingredient lists directly from .docx files, allowing users to input their existing meal plans easily. We also used it to parse a predefined list of available store items (simulating a Walmart inventory).
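The parsing step could be sketched as below. python-docx exposes a document's text as `Document(path).paragraphs` (each with a `.text` attribute); the "Meal:" header plus "- ingredient" line format used here is a hypothetical layout for illustration, not necessarily the one BudgetBites expects.

```python
def parse_meals(paragraph_texts):
    """Group '- ingredient' lines under the preceding 'Meal:' header."""
    meals = {}
    current = None
    for text in paragraph_texts:
        line = text.strip()
        if line.lower().startswith("meal:"):
            current = line[5:].strip()
            meals[current] = []
        elif line.startswith("-") and current:
            meals[current].append(line.lstrip("- ").strip())
    return meals

# With python-docx, the input would come from the parsed document:
#   from docx import Document
#   paragraph_texts = [p.text for p in Document("meals.docx").paragraphs]
paragraphs = ["Meal: Veggie Stir Fry", "- 1 bell pepper", "- 200g rice",
              "Meal: Pasta Night", "- 1 box spaghetti"]
print(parse_meals(paragraphs))
```

The same function can parse the simulated store inventory file, since python-docx returns plain paragraph strings either way.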

Local LLM Core: The "brain" of BudgetBites relies on a locally hosted Large Language Model (LLM), specifically gemma3:4b, accessed via the Ollama framework and its Python client (ollama-python). This allows for powerful natural language processing without relying entirely on external cloud APIs.

Core Logic & NLP Tasks:

The application first helps the user select meals using fuzzy matching (difflib) and number/name recognition.
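A minimal sketch of that selection step, using the standard library's difflib; the meal names and the 0.5 cutoff are illustrative choices, not the project's actual values.

```python
import difflib

MEALS = ["Veggie Stir Fry", "Pasta Night", "Chicken Burrito Bowl"]  # example names

def resolve_meal(user_input, meals=MEALS):
    """Accept either a 1-based meal number or an approximate meal name."""
    choice = user_input.strip()
    if choice.isdigit():                       # number recognition
        idx = int(choice) - 1
        return meals[idx] if 0 <= idx < len(meals) else None
    # Case-insensitive fuzzy matching; cutoff=0.5 tolerates typos like "burito".
    lowered = {m.lower(): m for m in meals}
    hits = difflib.get_close_matches(choice.lower(), list(lowered), n=1, cutoff=0.5)
    return lowered[hits[0]] if hits else None

print(resolve_meal("2"))                   # Pasta Night
print(resolve_meal("chiken burito bowl"))  # Chicken Burrito Bowl
```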

It then makes the first call to the Ollama LLM, providing the details of the selected meals and prompting it to generate a consolidated grocery list, combining ingredient quantities.

A second, more complex call is made to the LLM. This prompt includes the generated grocery list and the parsed store inventory. The LLM is specifically instructed to compare the lists, find the best match for each grocery item in the store inventory, and provide an estimated price for each matched item, following a strict output format.
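The shape of that second call might look like the sketch below. The prompt wording and the `ITEM | MATCH | $PRICE` output format are assumptions for illustration; the `ollama.chat` call mirrors the ollama-python client's API and requires a locally running Ollama server.

```python
def build_matching_prompt(grocery_list, store_inventory):
    """Prompt for the second LLM call: match grocery items to store stock.
    The wording and the strict line format here are illustrative only."""
    return (
        "You are given a grocery list and a store inventory.\n"
        "For each grocery item, find the best-matching store item and "
        "estimate its price.\n"
        "Respond with one line per item, exactly in this format:\n"
        "ITEM | MATCHED STORE ITEM | $PRICE\n\n"
        "Grocery list:\n" + "\n".join(f"- {g}" for g in grocery_list) + "\n\n"
        "Store inventory:\n" + "\n".join(f"- {s}" for s in store_inventory)
    )

def ask_llm(prompt, model="gemma3:4b"):
    """Send the prompt to a locally running Ollama server."""
    import ollama  # pip install ollama; needs `ollama serve` running
    reply = ollama.chat(model=model,
                        messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]

prompt = build_matching_prompt(["2 bell peppers", "1 box spaghetti"],
                               ["Bell Pepper, each", "Spaghetti 16oz"])
print(prompt)
```

Demanding a rigid one-line-per-item format is what makes the downstream regex extraction reliable.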

Price Extraction & Calculation: Regular expressions (re) are used to parse the formatted output from the second LLM call, specifically extracting the estimated dollar amounts for matched items. These are then summed up to provide a total estimated cost.
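The extraction step can be sketched with a single pattern; the exact regex and the sample LLM output below are illustrative, not taken from the project.

```python
import re

# Matches dollar amounts like $3, $3.5, or $3.49 in the LLM's formatted output.
PRICE_RE = re.compile(r"\$(\d+(?:\.\d{1,2})?)")

def total_cost(llm_output):
    """Sum every dollar amount found in the LLM's response."""
    return sum(float(m) for m in PRICE_RE.findall(llm_output))

sample = (
    "bell peppers | Bell Pepper, each | $1.48\n"
    "spaghetti | Spaghetti 16oz | $0.98\n"
    "olive oil | Olive Oil 17oz | $4.50\n"
)
print(f"Total: ${total_cost(sample):.2f}")  # Total: $6.96
```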

API Backend: To prepare for frontend integration, the entire logic was refactored into a backend API using FastAPI.

Pydantic models were defined to validate incoming requests (list of selected meal identifiers) and structure the outgoing JSON responses (shopping list text, total cost).

Specific API endpoints were created: /available-meals (GET) to list meals for the UI, and /generate-shopping-list (POST) to perform the main generation logic.

Error handling was implemented to return appropriate HTTP status codes and messages.

CORS middleware was configured to allow requests from a frontend application.

Development Environment: The final version is designed to run locally, requiring Python, necessary libraries, and a running Ollama server with the specified model.

Accomplishments that we're proud of

Successfully implementing the full pipeline: parsing user documents and meal selections, chaining multiple LLM calls for complex reasoning (consolidation, comparison, and price estimation), and producing structured output with a final cost calculation.

What we learned

About Gemma (gemma3:4b), Ollama, and working with locally hosted LLMs

What's next for BudgetBites

Create a mobile app
