Inspiration

The idea originated from a common but overlooked problem: standing in front of a fridge full of random ingredients and not knowing what to cook. This often leads to food waste—not due to scarcity, but due to a lack of decision-making support.

We approached this as a reasoning and automation problem rather than just a cooking issue. What if an AI system could not only understand available ingredients but also decide, plan, and optimize meals automatically?

At the same time, cooking is inherently social—people love sharing recipes, discovering new ideas, and learning from others. So we extended this vision beyond individual use to include a community-driven experience, where users can share meals, get feedback, and discover what others are creating.

Nutri Chef AI was built to transform everyday kitchen uncertainty into an intelligent, automated, and collaborative experience—turning leftovers into structured, healthy meals while reducing waste and effort.


What it does

Nutri Chef AI is an AI-powered meal planning and automation assistant that converts raw ingredients into complete, optimized meal plans. It operates in two modes:

  1. Vision Mode: Users upload an image of their fridge or pantry. A vision pipeline detects and extracts ingredients, which are then passed to an AI agent for reasoning and planning.

  2. Creative Mode: Users enter ingredients, preferences, or constraints (e.g., high-protein, low-calorie), and the system generates structured meal outputs.

Instead of simple generation, the system performs a multi-step automated workflow:

  • Identifies available ingredients
  • Generates structured recipes
  • Calculates detailed nutritional values
  • Detects missing ingredients
  • Produces an intelligent shopping list
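The workflow above can be sketched in miniature: pick the recipe that best matches what is on hand, then derive the shopping list from whatever is still missing. This is an illustrative toy, not the actual Nutri Chef AI code; the function and data names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class MealPlan:
    recipe: str
    missing: list = field(default_factory=list)  # becomes the shopping list

def plan_meal(available, recipe_book):
    """Choose the recipe with the fewest missing ingredients and
    record the missing ones as a shopping list."""
    best, best_missing = None, None
    for name, needed in recipe_book.items():
        missing = sorted(set(needed) - set(available))
        if best_missing is None or len(missing) < len(best_missing):
            best, best_missing = name, missing
    return MealPlan(recipe=best, missing=best_missing)

fridge = ["eggs", "spinach", "cheese"]
recipes = {
    "omelette": ["eggs", "cheese", "butter"],
    "spinach salad": ["spinach", "olive oil", "lemon", "feta"],
}
plan = plan_meal(fridge, recipes)
print(plan.recipe, plan.missing)  # omelette ['butter']
```

In the real system these decisions are made by the LLM agent rather than a fixed heuristic, but the shape of the output (chosen recipe plus missing-ingredient list) is the same.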

In addition, the platform includes a community layer:

  • Users can share their generated or custom recipes
  • Others can like, explore, and get inspired
  • Users receive recommendations based on popular and relevant meals

This transforms the system from a single-user tool into a collaborative AI-powered cooking ecosystem.


How we built it

We designed the system as a high-throughput agentic pipeline that integrates multiple AI components into a unified workflow:

  • Orchestration (Agent Layer): Built with LangChain, enabling the system to dynamically decide which tools to invoke (recipe generation, nutrition calculation, ingredient detection, and recommendation logic).

  • Inference Engine: Powered by Groq’s LPU infrastructure, running llama-3.3-70b-versatile for ultra-fast response times and near real-time interaction.

  • Vision Processing: Integrated the LogMeal API for ingredient detection and segmentation, keeping only detections above a probability threshold: $$ P(\text{ingredient}) \ge 0.35 $$

  • Nutrition Intelligence: Implemented a scaling mechanism to compute total nutritional values from USDA-standard per-100 g data: $$ \text{TotalNutrient} = \sum_{i=1}^{n} \left( \frac{\text{grams}_i}{100} \times \text{nutrient\_base}_i \right) $$

  • Community & Data Layer: Built on PostgreSQL, allowing users to store, share, and interact with recipes. This enables features like likes, recommendations, and personalized cookbooks.

  • Backend & Deployment: Developed with FastAPI, containerized with Docker, and deployed via Railway (backend) and Vercel (frontend), ensuring scalability and performance.
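The nutrition scaling formula above is a direct per-100 g proportion; a minimal sketch follows. The nutrient values here are illustrative placeholders, not actual USDA figures.

```python
def total_nutrient(items):
    """Scale each ingredient's per-100 g base value by its weight and sum:
    TotalNutrient = sum(grams_i / 100 * nutrient_base_i)."""
    return sum(grams / 100 * base for grams, base in items)

# (grams used, protein per 100 g) -- illustrative numbers only
ingredients = [(150, 13.0), (50, 25.0)]  # e.g. eggs, cheese
print(total_nutrient(ingredients))  # 32.0
```

The same function works for any nutrient (calories, fat, carbs) as long as the base values are expressed per 100 g, which is the USDA convention.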


Challenges we ran into

One of the biggest technical challenges was ensuring reliable structured outputs from LLMs. Even advanced models sometimes return extra conversational text around JSON, which can break downstream systems.

To solve this, we implemented a robust parsing layer that uses regex-based extraction to isolate the outermost JSON object, ensuring consistent and safe communication between backend and frontend.
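A minimal version of this kind of extraction looks like the following. This is a sketch of the general technique (greedy match from the first `{` to the last `}`), not our exact implementation.

```python
import json
import re

def extract_json(text):
    """Strip conversational wrapping from an LLM response and parse
    the outermost JSON object it contains."""
    match = re.search(r"\{.*\}", text, re.DOTALL)  # greedy: first '{' to last '}'
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

raw = 'Sure! Here is your recipe:\n{"recipe": "omelette", "calories": 320}\nEnjoy!'
print(extract_json(raw))  # {'recipe': 'omelette', 'calories': 320}
```

Parsing the extracted span with `json.loads` (rather than trusting the regex alone) means malformed output fails loudly in the backend instead of propagating broken data to the frontend.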

Another challenge was integrating multiple components—vision, LLM reasoning, nutrition logic, and community features—into a smooth, low-latency pipeline. Optimizing inference speed using Groq and designing efficient orchestration were critical to achieving near real-time performance.

Balancing automation with user control was also important, ensuring the system remains powerful while still intuitive and user-friendly.


What we learned

This project reinforced that impactful AI systems are not just about models—they are about orchestrating intelligence into real-world workflows.

We learned how to:

  • Design agent-based systems that go beyond single LLM calls
  • Combine vision, language, and data systems into one pipeline
  • Build community-driven AI applications that scale beyond individual users
  • Optimize for latency, reliability, and user experience

Most importantly, we realized that the real value of AI lies in automating decisions and enhancing human creativity—not just generating outputs, but building systems that people can rely on in their daily lives.

