Inspiration
Potluck, but for your fridge!
As students eating at UNT dining halls daily, we see massive food waste while millions face food insecurity and obesity. Existing apps only track what you've already eaten (like MyFitnessPal or Cal AI); nothing helps you make better choices in the moment, especially since you often don't have the luxury of pre-selecting your ingredients. So we decided to make something that suggests healthy recipes from the items already in your fridge.
In particular, we focused on accessibility:
- Confidence-based image recognition, so the user can verify that the detected ingredient list is accurate
- Highly accurate macro tracking. Apps like Cal AI just feed a photo into ChatGPT, which amounts to guesswork; since we suggest the recipes ourselves, macro and calorie calculation is a simple summation over the ingredients using government (USDA) datasets.
- Reduced motion, dyslexia-friendly fonts, and color-blindness modes, plus a comprehensive tutorial/onboarding to teach you how to use the app!
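Because the app proposes the recipe itself, its macro math reduces to a per-ingredient summation over known nutrition values. A minimal sketch of that idea (the dataset and field names here are illustrative, not the app's real schema, though the per-100 g convention matches how USDA data is published):

```python
# Stand-in for a slice of the bundled USDA-style nutrition table,
# with values expressed per 100 g of ingredient (names are illustrative).
USDA_PER_100G = {
    "chicken breast": {"kcal": 165, "protein_g": 31.0, "carbs_g": 0.0, "fat_g": 3.6},
    "brown rice":     {"kcal": 112, "protein_g": 2.3,  "carbs_g": 23.5, "fat_g": 0.8},
}

def recipe_macros(ingredients):
    """ingredients: list of (name, grams) pairs -> summed macro totals.

    No model inference involved: each macro is just the per-100 g value
    scaled by the quantity, summed across all ingredients.
    """
    totals = {"kcal": 0.0, "protein_g": 0.0, "carbs_g": 0.0, "fat_g": 0.0}
    for name, grams in ingredients:
        per100 = USDA_PER_100G[name]
        for key in totals:
            totals[key] += per100[key] * grams / 100.0
    return totals
```

Since every term comes from a published dataset rather than a vision-language model's guess, the totals are exactly as accurate as the user's portion sizes.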
What it does
Snap a photo of your fridge. AI identifies ingredients via image segmentation and Apple's Vision framework (VNClassifyImageRequest for classification, VNRecognizeTextRequest for OCR), suggests recipes matched to your dietary goals, and tracks nutrition progress against USDA data -- all fully offline.
How we built it
Vision framework (image classification + OCR) with a confidence router. SwiftUI + Swift Charts for UI. GRDB/SQLite bundle with 1,000+ validated recipes and USDA nutrition data. Python pipelines (Beautiful Soup, Pydantic) scraped and validated all bundled data offline. Claude/GPT-4 generated recipes against strict JSON schemas with multi-pass validation.
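The actual pipeline used Pydantic models against strict JSON schemas; this stdlib-only sketch shows the same multi-pass idea (field names and the ingredient set are illustrative): a structural pass rejects type errors, and a semantic pass rejects any ingredient that can't be looked up in the bundled nutrition table, so hallucinated recipes never ship.

```python
# Stand-in for the keys of the bundled USDA nutrition table (illustrative).
KNOWN_INGREDIENTS = {"egg", "brown rice", "chicken breast"}

def validate_recipe(raw: dict) -> bool:
    """Multi-pass validation of one LLM-generated recipe dict."""
    # Pass 1: structural/type checks (Pydantic handles this declaratively).
    if not isinstance(raw.get("title"), str):
        return False
    if not isinstance(raw.get("servings"), int) or raw["servings"] <= 0:
        return False
    ingredients = raw.get("ingredients")
    if not isinstance(ingredients, list) or not ingredients:
        return False
    for ing in ingredients:
        if not isinstance(ing.get("grams"), (int, float)) or ing["grams"] <= 0:
            return False
        # Pass 2: semantic check -- an ingredient we can't look up in the
        # nutrition table is treated as a hallucination and rejected.
        if ing.get("name") not in KNOWN_INGREDIENTS:
            return False
    return True
```

Recipes that fail either pass are dropped from the bundle rather than patched, which keeps the macro summation trustworthy downstream.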
Challenges we ran into
No emulator camera access → built robust demo scenarios for onboarding
AI confidence calibration → defers to users on uncertain detections
Batch recipe generation hallucinations → caught by rigid Pydantic validation
Accessibility as architecture, not afterthought
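The confidence-calibration fix above can be sketched as a small router: high-confidence detections are auto-accepted, mid-confidence ones are surfaced to the user for confirmation, and low-confidence ones are dropped. The thresholds and labels below are assumptions for illustration, not the app's actual values:

```python
ACCEPT_AT = 0.85  # assumed threshold: trust the model outright
ASK_AT = 0.40     # assumed threshold: defer to the user

def route(detections):
    """detections: list of (label, confidence) pairs from the vision model.

    Returns (auto-accepted labels, labels queued for user confirmation).
    Anything below ASK_AT is silently discarded -- the app asks
    instead of guessing.
    """
    accepted, ask_user = [], []
    for label, conf in detections:
        if conf >= ACCEPT_AT:
            accepted.append(label)
        elif conf >= ASK_AT:
            ask_user.append(label)  # user confirms or rejects in the UI
    return accepted, ask_user
```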
Accomplishments that we're proud of
Fully offline! We built image recognition, segmentation, and a polished UI, all in a 6.7 MB app that runs on your phone -- in about 24 hours! Accessibility baked in (reduced motion, dyslexia-friendly fonts, color-blindness modes). Confidence-aware AI that asks instead of guessing.
What we learned
Constraining AI output ethically is essential. Using confidence-based routing and our "random-ahh python pipelines", we turned hallucination-prone models into reliable tools. Designing for accessibility first made the app better for everyone.
What's next for FridgeLuck
Partnering with UNT Dining for campus-scale waste reduction, community recipe sharing, adaptive recommendations from cooking history, and expanding to serve food-insecure families nationwide.