Inspiration

2 billion people suffer from micronutrient deficiency. In states like Uttar Pradesh and Bihar, over 40% of children are stunted. The problem is not that food doesn't exist — it's that no one can see the deficiencies, and surplus food goes to waste while families go hungry 10km away.

We asked: what if a community health worker could scan any meal with their phone and instantly know what nutrients are missing? What if a farmer could post surplus food and an NGO could claim it in seconds? That question became NutriVision AI.


What it does

NutriVision AI is a 5-module progressive web app for community health workers, NGOs, farmers, and families in underserved communities:

  • AI Meal Scanner — Upload any meal photo. Gemini 2.0 Flash Vision analyzes the nutritional profile, flags deficiency risks (iron, zinc, Vitamin A), and suggests cheap, locally available fortification foods with their Hindi names. Gemini 1.5 Flash serves as an automatic fallback.

  • Surplus Food Dashboard — A live CRUD map where farmers, NGOs, and markets post available food surplus. Real OpenStreetMap markers, real Supabase data — fully functional, not mocked. Judges can add, reserve, and claim records live.

  • NutriLearn Flashcard Generator — Powered by Featherless.ai (DeepSeek-V3). Community health workers pick a topic (iron deficiency, MUAC screening, child stunting) and get AI-generated flashcards with South Asian food examples and 3D flip animations.

  • NutriBot — A WhatsApp-style nutrition chatbot (Gemini 1.5 Flash) that responds in English and Hindi with locally relevant, affordable food advice.

  • Risk Assessment — A 3-step malnutrition screening form. Assess any patient in under 2 minutes. Returns risk level, predicted deficiency types, and specific action recommendations.
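As a sketch of how a screening step like this can map a measurement to a risk level: the cut-offs below are the standard WHO MUAC (mid-upper-arm circumference) thresholds for children aged 6–59 months, but the function name, return shape, and action text are our illustration, not NutriVision's actual code.

```javascript
// Hypothetical sketch of one screening step: classify malnutrition risk
// from MUAC in centimeters. Thresholds follow the WHO cut-offs for
// children 6-59 months; the code itself is illustrative only.
function classifyMuacRisk(muacCm) {
  if (muacCm < 11.5) {
    return { level: 'severe', action: 'Refer for therapeutic feeding immediately' };
  }
  if (muacCm < 12.5) {
    return { level: 'moderate', action: 'Supplementary feeding; re-screen in 2 weeks' };
  }
  return { level: 'low', action: 'Routine monitoring' };
}
```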


How we built it

We built NutriVision in 48 hours using React 18 + Vite for the frontend, Supabase for the live database, and three AI services: Google Gemini 2.0 Flash for meal vision analysis, Gemini 1.5 Flash for the NutriBot chatbot, and Featherless.ai running DeepSeek-V3-0324 for the NutriLearn flashcard generator.

The AI meal scanner sends a base64-encoded image to Gemini's vision endpoint and parses structured JSON back — macronutrients, deficiency risks, and fortification suggestions with Hindi names and cost estimates. If Gemini 2.0 fails, the app silently retries with Gemini 1.5 so the demo never breaks.
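Assuming the public generateContent REST endpoint, the request body looks roughly like this; the helper function and the abridged prompt are our sketch, while the field names (`contents`, `parts`, `inline_data`) follow Gemini's documented API.

```javascript
// Sketch of the request body sent to Gemini's generateContent endpoint.
// Field names follow the public REST API; prompt text is abridged and
// the helper itself is illustrative, not the app's verbatim code.
function buildVisionRequest(base64Jpeg) {
  return {
    contents: [{
      parts: [
        { text: 'Analyze this meal. Return ONLY a JSON object with macros, deficiency_risks, and fortification_suggestions.' },
        { inline_data: { mime_type: 'image/jpeg', data: base64Jpeg } },
      ],
    }],
  };
}

// The app would POST this body to (API key elided):
// https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent
```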

The surplus map uses React Leaflet with OpenStreetMap tiles (no paid API key) and Supabase realtime so new markers appear instantly without a page refresh.
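The realtime merge logic can be sketched independently of Supabase itself: each `postgres_changes` payload carries an `eventType` plus `new`/`old` rows, and the marker list is folded forward accordingly. The payload shape follows supabase-js v2; the reducer is our illustration.

```javascript
// Illustrative reducer that folds Supabase realtime payloads into the
// local marker list. The payload shape matches supabase-js v2
// `postgres_changes` events; this is not the app's verbatim code.
function applyRealtimeEvent(markers, payload) {
  switch (payload.eventType) {
    case 'INSERT':
      return [...markers, payload.new];
    case 'UPDATE':
      return markers.map((m) => (m.id === payload.new.id ? payload.new : m));
    case 'DELETE':
      return markers.filter((m) => m.id !== payload.old.id);
    default:
      return markers;
  }
}
```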

We built it as a PWA so it installs on Android and iOS and caches map tiles for offline use — critical for rural health workers with poor connectivity.
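If the PWA is built with vite-plugin-pwa (an assumption on our part), offline tile caching can be expressed as a Workbox runtime-caching rule like the fragment below; the cache name and expiration numbers are placeholders, while the `urlPattern` targets OSM's standard tile servers.

```javascript
// Hypothetical vite-plugin-pwa / Workbox config fragment: cache
// OpenStreetMap tiles cache-first so the map keeps working offline.
// (Whether the app uses vite-plugin-pwa is our assumption.)
VitePWA({
  workbox: {
    runtimeCaching: [{
      urlPattern: /^https:\/\/[abc]\.tile\.openstreetmap\.org\/.*/,
      handler: 'CacheFirst',
      options: {
        cacheName: 'osm-tiles',
        expiration: { maxEntries: 500, maxAgeSeconds: 60 * 60 * 24 * 30 },
      },
    }],
  },
})
```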


Challenges we ran into

Getting Gemini to consistently return clean JSON from image analysis was the hardest technical problem. We built a parseJSON() utility that strips markdown fences and finds the JSON object even when the model adds explanatory text before or after it. Without that, roughly 20% of scans would crash.
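A minimal version of such a utility (our reconstruction, not the exact parseJSON() in the repo) strips code fences, then parses only the span from the first `{` to the last `}` so any explanatory text around the object is ignored:

```javascript
// Minimal reconstruction of a fence-stripping JSON extractor for LLM
// output: remove ```json fences, then parse from the first '{' to the
// last '}' so surrounding prose from the model is discarded.
function parseJSON(raw) {
  const cleaned = raw.replace(/```(?:json)?/g, '').trim();
  const start = cleaned.indexOf('{');
  const end = cleaned.lastIndexOf('}');
  if (start === -1 || end === -1 || end < start) {
    throw new Error('No JSON object found in model output');
  }
  return JSON.parse(cleaned.slice(start, end + 1));
}
```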

The dual-AI fallback system (2.0 → 1.5) required careful state management — the UI needs to show which model is running and switch the badge text automatically without the user noticing any interruption.
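The fallback flow can be sketched as a single async helper that records which model actually answered, so the UI badge can update without interrupting the user. Function and field names here are illustrative, not the app's actual identifiers.

```javascript
// Illustrative dual-model fallback: try the primary model, silently
// retry with the fallback on failure, and report which model answered
// so the UI can switch its badge text. Names are our sketch.
async function analyzeWithFallback(image, callModel) {
  const models = ['gemini-2.0-flash', 'gemini-1.5-flash'];
  for (const model of models) {
    try {
      const result = await callModel(model, image);
      return { result, modelUsed: model }; // badge displays modelUsed
    } catch (err) {
      // fall through and try the next model
    }
  }
  throw new Error('All models failed');
}
```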

Building for real-world constraints was humbling. Hindi translation, offline PWA, and the "Try Example Meal" button (for users without a meal photo handy) were all late additions that turned out to be the most impactful UX decisions.


Accomplishments we're proud of

  • A meal scanner that returns structured, actionable nutrition data — not just "eat more vegetables"
  • A live surplus map with real seeded data from UP and Bihar that judges can interact with
  • Full English/Hindi toggle across every page including scan results
  • Featherless.ai DeepSeek-V3 integration producing high-quality educational flashcards with local South Asian food context
  • A PWA that installs on mobile and works offline — ready for communities with poor internet

What we learned

Building for communities with real constraints forces every feature to earn its place. The Hindi toggle took 2 hours and might reach more people than any other single feature. The "Try Example Meal" button was the last thing we added and the most important UX decision.

We learned that food redistribution is fundamentally an information problem. The surplus exists. The hunger exists. The missing piece is a shared map that both sides can see and update in real time.


What's next for NutriVision AI

  • 3-month pilot with village health units in Uttar Pradesh
  • WhatsApp bot integration (most CHWs already use WhatsApp)
  • SMS fallback for feature phones with no data
  • Bengali, Tamil, Telugu language support
  • Fine-tune a specialized South Asia nutrition model on collected scan data — the more meals scanned, the smarter the deficiency detection becomes

*Built in 48 hours. Open source. MIT licensed.*

Built With

  • deepseek-v3
  • featherless.ai
  • framer-motion
  • google-gemini-1.5-flash
  • google-gemini-2.0-flash
  • javascript
  • openstreetmap
  • pwa
  • react
  • react-leaflet
  • supabase
  • tailwind-css
  • vite
  • zustand