Inspiration
Food insecurity affects 1 in 3 UW students, yet the campus food pantry still runs on spreadsheets and gut instinct. Volunteers spend more time counting cans than helping people, and clients have no way to know what's available — or whether it's safe for their dietary needs — before making the trip. We wanted to build the infrastructure layer that food pantries are missing: real-time inventory intelligence that works for everyone in the system, from the volunteer packing donations to the student quietly looking for a meal.
What it does
Provision is a full-stack mobile app that digitizes every touchpoint in a food pantry's operation across four user roles:
Packers scan incoming donations using a 4-photo CV pipeline — front label, ingredients, nutrition facts, and expiry date — and Claude Vision extracts the item name, category, allergens, nutrition insights, and expiry date automatically. Multiple items can be queued and scanned in parallel while the packer moves on to the next box.
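The 4-photo contract can be sketched as follows (function and field names here are illustrative, not the actual backend code). Validating the photo count up front is exactly what would have caught the silent bug described below:

```python
import base64
import json

# Hypothetical sketch of the 4-photo scan contract: the backend expects
# exactly these four slots. A short payload should fail loudly instead of
# silently dropping the nutrition label.
PHOTO_SLOTS = ["front_label", "ingredients", "nutrition_facts", "expiry_date"]
REQUIRED_FIELDS = ["name", "category", "allergy_info",
                   "nutrition_insights", "expiry_date"]

def build_vision_content(photos):
    """Assemble the image blocks for a single Claude Vision call."""
    missing = [slot for slot in PHOTO_SLOTS if slot not in photos]
    if missing:
        raise ValueError(f"scan is missing photo slots: {missing}")
    content = []
    for slot in PHOTO_SLOTS:
        content.append({
            "type": "image",
            "source": {
                "type": "base64",
                "media_type": "image/jpeg",
                "data": base64.b64encode(photos[slot]).decode(),
            },
        })
    content.append({
        "type": "text",
        "text": "Extract name, category, allergy_info, nutrition_insights, "
                "and expiry_date from these photos. Reply with JSON only.",
    })
    return content

def parse_scan_result(model_text):
    """Parse the model's JSON reply and insist on every required field."""
    item = json.loads(model_text)
    for field in REQUIRED_FIELDS:
        if field not in item:
            raise ValueError(f"model reply missing field: {field}")
    return item
```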
Distributors use voice commands ("Remove 3 cans of black beans") or a search-and-select flow to record outflows in seconds. Whisper transcribes the audio, Claude Haiku parses the intent, and the app shows a confirmation screen before writing to the database.
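The validation step between Haiku's reply and the confirmation screen can be sketched like this (the action schema shown is an assumption, not the exact app code):

```python
import json
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the voice-pipeline contract: Whisper produces a
# transcript, Claude Haiku maps it to one of three structured actions, and
# we validate the shape before showing the confirmation screen.
ACTIONS = {"remove", "expire", "query"}

@dataclass
class VoiceAction:
    action: str
    item: str
    quantity: Optional[int] = None

def parse_intent_reply(haiku_json):
    """Validate Haiku's JSON reply into an action the user can confirm."""
    data = json.loads(haiku_json)
    if data.get("action") not in ACTIONS:
        raise ValueError(f"unknown action: {data.get('action')!r}")
    qty = data.get("quantity")
    if data["action"] == "remove" and (not isinstance(qty, int) or qty <= 0):
        raise ValueError("remove actions need a positive integer quantity")
    return VoiceAction(action=data["action"], item=data["item"], quantity=qty)
```

For the spoken command above, Haiku would return something like `{"action": "remove", "item": "black beans", "quantity": 3}`, which validates cleanly; malformed replies raise before anything touches the database.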
Admins get an AI-powered insights dashboard where they can ask natural language questions — "What's expiring this week?" or "What came in since my last shift?" — and receive answers grounded in live inventory data via an agentic tool-use loop.
Clients browse available food by pantry location, filter by category, and ask the AI assistant questions like "What's high in protein?" or "Any gluten-free options?" — with nutrition concerns and highlights surfaced directly on each item card.
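Because nutrition_insights is structured data rather than free text, the dietary filtering and card badges fall out of simple lookups. A minimal sketch (field names like high_sodium and good_protein are assumptions about the JSON shape):

```python
# Filter items by allergen and derive the red/green card badges directly
# from the structured nutrition_insights JSON — no extra user input needed.
def safe_for(items, allergen):
    """Return items whose allergy_info does not mention the allergen."""
    return [
        it for it in items
        if allergen.lower() not in (a.lower() for a in it.get("allergy_info", []))
    ]

def badges(insights):
    """Map structured insight flags to client-card badges."""
    out = []
    if insights.get("high_sodium"):
        out.append("High sodium")
    if insights.get("good_protein"):
        out.append("Good source of protein")
    return out
```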
How we built it
- Mobile: React Native (Expo) with Expo Router for file-based navigation, Supabase for auth and the Postgres database
- CV pipeline: Python FastAPI service running Claude Vision on 4 focused photos per item, extracting structured JSON (name, category, allergy_info, nutrition_insights, expiry_date) in a single API call
- Voice pipeline: OpenAI Whisper for STT → Claude Haiku for intent parsing → structured action (remove / expire / query) returned to the mobile app for user confirmation
- AI agents: Direct Anthropic API calls from the mobile client using a tool-use agentic loop — the admin summary agent and client food-finder agent each have purpose-built tools that query Supabase in real time
- Database: Supabase (Postgres) with a normalized schema — items, batches, movements, notifications — plus a JSONB nutrition_insights column with a GIN index for fast dietary queries
- Auth + RLS: Supabase Auth with role-based access control enforced at the database level across four roles: admin, packer, distributor, client
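The tool-use agentic loop mentioned above follows a standard shape: call the model, run any tool it asks for, feed the result back, and stop when it answers in plain text. A stripped-down sketch (the call_model hook and tool names are stand-ins; the real app calls the Anthropic Messages API with Supabase-backed tools):

```python
# A minimal agentic tool-use loop. call_model is injected so the loop is
# testable without network access; in production it wraps the Anthropic API.
def run_agent(question, call_model, tools, max_turns=5):
    messages = [{"role": "user", "content": question}]
    for _ in range(max_turns):
        reply = call_model(messages)
        if reply["type"] == "text":
            # The model answered in plain text: we're done.
            return reply["text"]
        # The model asked for a tool: run it and append the result so the
        # next model call is grounded in live inventory data.
        result = tools[reply["tool"]](**reply["input"])
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "tool", "content": result})
    raise RuntimeError("agent did not converge within max_turns")
```

This is the same loop the admin summary agent and client food-finder agent share; only the tool set differs per role.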
Challenges we ran into
The hardest bug was completely silent: our CV backend expected 4 photos, but the mobile app was sending only 3, so every scan silently dropped the nutrition label. nutrition_insights came back null in the database every single time and we had no idea — there was no error, just missing data. We caught it only by reading both sides of the API contract line by line.
We also hit a subtle Supabase schema issue: the is_active flag on batches (an admin-managed field designating the outflow batch for each item) was being misread as a "has stock" indicator. Filtering on it made the entire distributor search return zero results. The inventory view already enforces the quantity > 0 guard — we just had to stop double-filtering.
Getting expo-av to install cleanly in our Expo SDK environment required falling back to --legacy-peer-deps, and wiring up background scan jobs (fire-and-forget while the packer photographs the next item) required careful state management to avoid race conditions in the job queue.
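The fire-and-forget pattern boils down to giving each scan a unique job id and committing results under a lock, so a slow scan that finishes late can never clobber a newer job's state. A sketch of that idea (shown here in backend-style Python for brevity; the real app manages this in React Native state):

```python
import threading
import uuid

class ScanQueue:
    """Background scan jobs keyed by unique id, safe against late results."""

    def __init__(self):
        self._lock = threading.Lock()
        self._jobs = {}  # job_id -> "pending" | result dict

    def submit(self, run_scan, photos):
        job_id = str(uuid.uuid4())
        with self._lock:
            self._jobs[job_id] = "pending"

        def worker():
            result = run_scan(photos)
            with self._lock:
                # Commit only if this job is still pending — a cancelled or
                # superseded job's late result is simply dropped.
                if self._jobs.get(job_id) == "pending":
                    self._jobs[job_id] = result

        # Fire and forget: the packer moves on to the next box immediately.
        threading.Thread(target=worker, daemon=True).start()
        return job_id

    def status(self, job_id):
        with self._lock:
            return self._jobs.get(job_id)
```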
Accomplishments that we're proud of
Provision works end-to-end across all four roles in a single app. A volunteer can photograph a box of cereal and, within seconds, it appears in inventory — with allergens, calories, sodium content, and expiry date — visible to a client asking the AI "What's safe for someone with a nut allergy?" That full loop, from physical item to AI-readable structured data to client-facing insight, is something we're genuinely proud of.
We're also proud of the nutrition intelligence layer. Storing nutrition_insights as structured JSONB means the AI agent can reason about it — not just keyword match — and the client card UI can surface a red "High sodium" flag or a green "Good source of protein" badge with zero extra user effort.
What we learned
Agentic tool use is powerful but needs careful scoping. Giving the client agent access to the wrong tools (e.g., movement history) made it verbose and slow. Constraining it to query_inventory and query_nutrition only made responses faster, cheaper, and more relevant.
We also learned that role-aware UX matters as much as the AI features. The same underlying data looks completely different to a packer, a distributor, and a client — and getting those three experiences right required thinking about each user's mental model, not just the database schema.
What's next for Provision
- Upstream retail integration — connect to grocery store surplus APIs (Flashfood, Too Good To Go) so incoming donations can be auto-catalogued before they physically arrive
- Push notifications — alert admins when stock of a critical item falls below threshold or when batches are within 3 days of expiry
- Multi-pantry federation — share anonymized inventory data across a network of pantries so clients can be routed to the nearest location that actually has what they need
- Barcode fallback — for items with legible barcodes, resolve to the USDA FoodData Central API for instant nutrition data without needing a 4-photo scan
- White-label deployment — package Provision as a configurable platform that any food bank in the Feeding America network can deploy with their own branding and pantry locations
Built With
- anthropic
- claude
- cv
- expo.io
- fastapi
- javascript
- llm
- python
- react-native
- slowapi
- sql
- supabase
- swift
- ts
- whisper