Inspiration

Being the university students we are, our two favourite things are saving money and eating delicious food! So we combined the two: we created a program that automatically scrapes nearby stores' discount flyers and then suggests meals the user can cook using those deals.

We figured: between studying, homework, projects and deadlines, we still want to eat well and spend less. Why not let technology do the heavy lifting of finding discounts and suggesting meals that fit them?


What it does

  • The application scans online flyers from local stores for coupons or discount offers.
  • It pulls those deals and saves them into a CSV file for later processing.
  • Based on the deals found, it suggests meals the user can cook using a selection of discounted items.
  • It thereby links “what’s on sale / what’s cheap” with “what meals can I make” — helping users both save money and eat well.
  • There is a frontend and a backend: a Python backend for scraping and storing the data, and a TypeScript frontend (Vite + Tailwind CSS) for the user interface.

How we built it

  • Scraper / backend: We wrote Python code (in src/backend/app.py) that fetches flyer data (e.g., from store websites) and transforms it into structured data.
  • CSV export: Deals are output to CSV so they can be reviewed, processed or used easily.
  • Frontend: The UI is built with TypeScript/Node, using the Vite framework and Tailwind CSS for styling.
  • Recipe database: We loaded a wide range of recipes and ingredients from the MealDB API, including images, steps and YouTube videos.
  • Integration: The backend API serves data of deals/recipes using Flask; the frontend consumes it and displays suggestions to the user (e.g., “Here are the deals near you, here are meals you can cook with them”).
  • Development flow: The npm run project:start command orchestrates the whole process: Python backend install → Node frontend install → launch both concurrently.
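The scrape → store → suggest pipeline above can be sketched roughly as follows. This is a minimal, simplified sketch in the style of our Python backend; the `parse_flyer_items` helper, the `Deal` fields, and the sample data are illustrative, not the actual contents of src/backend/app.py:

```python
import csv
from dataclasses import dataclass, asdict

@dataclass
class Deal:
    store: str
    item: str
    price: float
    discount_pct: int

def parse_flyer_items(raw_items):
    """Turn raw scraped flyer entries (dicts) into structured Deal records.

    Entries missing a price are skipped, since they can't drive suggestions.
    """
    deals = []
    for entry in raw_items:
        try:
            deals.append(Deal(
                store=entry["store"].strip(),
                item=entry["item"].strip().lower(),
                price=float(entry["price"]),
                discount_pct=int(entry.get("discount_pct", 0)),
            ))
        except (KeyError, ValueError):
            continue  # tolerate malformed flyer rows rather than crash the scrape
    return deals

def write_deals_csv(deals, path):
    """Export deals to CSV for later review and processing."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["store", "item", "price", "discount_pct"]
        )
        writer.writeheader()
        for deal in deals:
            writer.writerow(asdict(deal))

if __name__ == "__main__":
    raw = [
        {"store": "FreshMart", "item": "Chicken Breast",
         "price": "4.99", "discount_pct": "30"},
        {"store": "FreshMart", "item": "Broccoli"},  # no price -> skipped
    ]
    write_deals_csv(parse_flyer_items(raw), "deals.csv")
```

In the real app, the Flask backend then serves these deals (and the matched recipes) to the frontend as JSON.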

Challenges we ran into

  • Data scraping reliability: Store flyers often change format, use image PDFs or dynamic content, which made scraping tricky. Parsing consistent structured data from different flyer sources required extra work.
  • Recipe-deal matching logic: Suggesting relevant meals based on arbitrary deals is non-trivial. We had to decide how many discounted ingredients to include, how to match to recipes, how many deals vs how many meals, etc.
  • Time constraints: Because this was built in a hackathon, we had to scope features carefully and prioritise the core pipeline (scrape → store → suggest).
  • Frontend-backend coordination: Ensuring the backend data schema aligned with the frontend UI, and dealing with CORS and dev-environment issues.
  • User experience: Making sure the UI showed deals in a simple, easy-to-understand way, and making the meal suggestions meaningful rather than generic.
  • Database/CSV handling: Ensuring data integrity when exporting to CSV, handling duplicates, stale deals, expired discounts, etc.
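One simple way to tackle the recipe-deal matching problem above is to score each recipe by how many of its ingredients are currently discounted and rank recipes accordingly. The sketch below illustrates that idea; the recipe data, `min_matches` threshold, and tie-breaking rule are assumptions for illustration, not our exact matching logic:

```python
def suggest_meals(recipes, discounted_items, min_matches=2):
    """Rank recipes by how many of their ingredients appear in current deals.

    recipes: mapping of recipe name -> set of ingredient names
    discounted_items: set of ingredient names currently on sale
    Returns (recipe, matched_ingredients) pairs, best matches first.
    """
    scored = []
    for name, ingredients in recipes.items():
        matched = ingredients & discounted_items
        if len(matched) >= min_matches:  # skip recipes barely touched by deals
            scored.append((name, matched))
    # More discounted ingredients first; break ties alphabetically
    scored.sort(key=lambda pair: (-len(pair[1]), pair[0]))
    return scored

recipes = {
    "stir fry": {"chicken", "broccoli", "rice", "soy sauce"},
    "omelette": {"eggs", "cheese", "milk"},
    "chicken rice bowl": {"chicken", "rice", "avocado"},
}
deals = {"chicken", "rice", "broccoli"}
# "stir fry" matches 3 discounted ingredients, "chicken rice bowl" matches 2,
# and "omelette" matches none, so it is dropped.
print(suggest_meals(recipes, deals))
```

A scheme like this keeps suggestions tied to the deals rather than generic, and the threshold controls how many deals vs. how many meals appear.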

Accomplishments that we’re proud of

  • We implemented the full end-to-end pipeline: scraping live deals, storing them, exporting them to CSV, and suggesting meals based on those deals.
  • The app uses a modern tech stack: Python backend for core logic, TypeScript/Node + Vite + Tailwind for UI.
  • We created a usable prototype under time pressure that demonstrates the idea clearly.
  • We built a database of meals/recipes (meals.db) enabling meaningful suggestions.
  • We validated the concept: users can open the UI, see deals, and see “what can I cook” suggestions — thereby delivering on the promise of saving money + eating well.

What we learned

  • We learned how to build a web scraper and the associated challenges (dynamic content, inconsistent formats, error handling).
  • We gained experience integrating a backend and frontend stack, including launching both together, wiring up APIs, and managing dependencies.
  • We deepened our knowledge of frontend tooling (TypeScript, Vite, Tailwind) and how to present data in a user-friendly way.
  • We learned about feature prioritisation in a hackathon context: focusing on core MVP rather than perfect UX or all edge cases.
  • We saw how user value comes from combining two domains (discount-finding and recipe-suggestion) rather than doing each in isolation.

What’s next for Deal-to-Dish

  • Expand scraper coverage: Add more stores / flyers, support PDF flyer formats, image OCR for non-HTML flyer data.
  • Improve recipe matching: Develop more intelligent recommendation algorithms, allow users to input dietary restrictions, cuisine preferences, etc.
  • Real-time updates and notifications: Send alerts when a deal appears that matches a user’s favourite meals or ingredients.
  • Mobile friendly UI / responsive design: Improve frontend for mobile devices, maybe create a mobile app wrapper.
  • User accounts & personalization: Let users save favourite meals, set budgets, track savings over time.
  • Integration with shopping list / checkout: Automatically build a shopping list from chosen deal-meals, maybe integrate with store APIs for online ordering.
  • Analytics & savings tracking: Show users how much they saved, highlight best deals, trending items.
  • Deployment & production readiness: Move from prototype to a hosted service, handle error cases, data refresh, scaling.
