Inspiration: The inspiration came from a common "first-world problem": a fridge full of food but "nothing to eat." I noticed that food waste often happens because people don't know how to combine aging ingredients or feel intimidated by complex recipes. I wanted to create a tool that acts as both a sustainability hero and a culinary mentor, turning the chaos of a cluttered fridge into a confident cooking experience.

What it does: AI Sous Chef is a multimodal cooking assistant. Users simply take a photo of their ingredients. Using Gemini's vision capabilities, the app identifies the items and assesses their freshness (e.g., spotting wilted greens). It then generates personalized, waste-reducing recipes based on user preferences (diet, skill level, time). During the cooking process, the user can upload photos of their progress to receive real-time visual feedback, ensuring their technique—like whisking eggs or searing a steak—is on the right track.
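As a rough sketch of the identification step, the photo and a structured prompt would go to the model together, and the reply would be parsed into a list of items with freshness labels. The prompt wording and the `parse_ingredients` helper below are illustrative, not the app's actual code; the commented-out call shows roughly how it would connect to the google-generativeai SDK.

```python
# Sketch of the ingredient-identification step (hypothetical helper names).
# The model is asked to return a JSON array with one entry per detected
# item plus a freshness note.
import json

IDENTIFY_PROMPT = (
    "List every food item you can see in this photo as a JSON array. "
    'Each entry: {"item": str, "freshness": "fresh" | "use-soon" | "spoiled"}.'
)

def parse_ingredients(model_reply: str) -> list[dict]:
    """Parse the model's JSON reply, tolerating a markdown code fence."""
    text = model_reply.strip()
    if text.startswith("```"):
        text = text.split("```")[1]           # drop the surrounding fence
        text = text.removeprefix("json").strip()
    return json.loads(text)

# The actual vision call would look roughly like:
#   import google.generativeai as genai
#   model = genai.GenerativeModel("gemini-1.5-flash")
#   reply = model.generate_content([fridge_photo, IDENTIFY_PROMPT])
#   items = parse_ingredients(reply.text)
```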

How we built it:

- Core Engine: We used the Gemini 1.5 Flash model for its incredible speed and multimodal efficiency.
- Vision & Reasoning: We leveraged Gemini's ability to reason across images and text to identify ingredients and provide culinary substitutions.
- Frontend: Built with [mention your framework, e.g., Flutter/React Native] to provide a seamless, mobile-first experience.
- Prompt Engineering: We designed specialized system instructions to ensure the AI maintained a helpful, chef-like persona while prioritizing food safety and waste reduction.
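To make the prompt-engineering point concrete: with the google-generativeai SDK, a persona like this is attached as a `system_instruction` when the model is constructed. The wording below is an illustrative reconstruction, not the project's actual prompt.

```python
# Sketch of the "chef persona" system instruction described above.
# The exact wording is illustrative.
def build_system_instruction(skill_level: str, diet: str) -> str:
    return (
        "You are a friendly sous chef. Always prioritize food safety: "
        "never suggest using spoiled ingredients. Prefer recipes that use "
        "the most perishable items first to reduce waste. "
        f"The cook is {skill_level}-level and eats a {diet} diet, so keep "
        "instructions appropriately simple and compliant with that diet."
    )

# Attached at model construction, roughly:
#   model = genai.GenerativeModel(
#       "gemini-1.5-flash",
#       system_instruction=build_system_instruction("beginner", "vegetarian"),
#   )
```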

Challenges we ran into: The biggest challenge was visual noise. A crowded fridge has overlapping items and uneven lighting, and initially the model struggled to identify items at the back. We solved this by implementing a "Refinement Loop": when Gemini is unsure, it asks the user to confirm the identified items or take a photo from a second angle. Another hurdle was calibrating the cooking feedback to be encouraging yet technically accurate without being overly critical.
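The Refinement Loop can be sketched as a simple confidence gate: high-confidence detections pass straight through, low-confidence ones trigger a user confirmation. The threshold and all names here are illustrative assumptions, not the app's actual values.

```python
# Minimal sketch of the "Refinement Loop": low-confidence detections are
# confirmed with the user instead of being silently accepted or dropped.
CONFIDENCE_THRESHOLD = 0.7  # assumed cutoff, tuned by hand in practice

def refine_inventory(detections, confirm):
    """detections: list of (item, confidence) pairs from the vision pass.
    confirm: callback asking the user whether a doubtful item is present,
    e.g. "Is that kale in the back, or should I see a second angle?"
    """
    inventory = []
    for item, confidence in detections:
        if confidence >= CONFIDENCE_THRESHOLD:
            inventory.append(item)
        elif confirm(item):  # user verified the uncertain item
            inventory.append(item)
    return inventory
```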

Accomplishments that we're proud of: We are incredibly proud of the Freshness Detection feature. It's one thing to see an "apple," but it's another for an AI to recognize "that apple is bruised, use it in a crumble today." We also successfully implemented a Real-time Substitution Engine that doesn't just swap ingredients but explains why (e.g., swapping butter for oil and explaining the smoke point difference).
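One way to picture a substitution entry that carries its own "why", mirroring the butter-for-oil example above: each swap pairs a replacement with an explanation. The table contents and function names are examples, not the app's actual data (in practice the explanation would come from the model rather than a fixed table).

```python
# Illustrative substitution table: each entry is (replacement, rationale).
SUBSTITUTIONS = {
    "butter": (
        "olive oil",
        "Oil has a higher smoke point, so it tolerates a hotter pan, "
        "though you lose some browning from the milk solids.",
    ),
    "buttermilk": (
        "milk + lemon juice",
        "The acid curdles the milk slightly, mimicking buttermilk's tang "
        "and its reaction with baking soda.",
    ),
}

def substitute(ingredient: str) -> str:
    """Return a swap suggestion that includes the reasoning."""
    swap, why = SUBSTITUTIONS[ingredient]
    return f"Swap {ingredient} for {swap}: {why}"
```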

What we learned: We learned that multimodality is the future of UX. Users don't want to type a list of 20 ingredients; they want to show them. We also discovered how powerful Gemini's long context window is—the app remembers your initial fridge scan even when you are twenty steps deep into a recipe, allowing for a truly "stateful" conversation without losing track of your pantry.
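The stateful behavior above falls out of keeping the whole conversation, including the initial scan, in the model's context window. The class below models that history locally as a sketch (all names are illustrative); the commented-out lines show how it maps onto the SDK's chat sessions.

```python
# Sketch of the "stateful" conversation: the fridge scan stays at the
# start of the history, so turn twenty can still reference the pantry.
class CookingSession:
    def __init__(self, pantry: list[str]):
        # The scan result is the first turn and never leaves the context.
        self.history = [f"Fridge scan: {', '.join(pantry)}"]

    def ask(self, question: str) -> list[str]:
        """Record the turn and return the full context sent to the model."""
        self.history.append(question)
        return self.history

# With the SDK this maps onto a chat session, roughly:
#   chat = model.start_chat()
#   chat.send_message([fridge_photo, "What can I make?"])
#   ... twenty steps later ...
#   chat.send_message("Do I still have anything green left?")
```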

What's next for AI Sous Chef:

- Voice Integration: Implementing Gemini Live or a hands-free voice mode so users don't have to touch their phones with floury hands.
- Grocery Integration: Automatically adding missing "gap" ingredients to a local delivery cart.
- Community Sharing: Allowing users to share their Gemini-generated culinary wins with a community of fellow "Waste-Zero" home chefs.
