Problem Statement 🧩
Most popular diet and calorie tracking apps focus primarily on total calories and basic macronutrients, often ignoring the detailed breakdown of vitamins, minerals, and other critical nutrients that influence long-term health. Many users end up “hitting their calories” but still consume unbalanced diets that are low in essential micronutrients such as iron, B vitamins, and key minerals, which are linked to fatigue, poor immunity, and other health risks.
Current food logging workflows are tedious: people must search large databases, manually enter items, or scan barcodes, which discourages consistent use and leads to incomplete logs, especially for quick snacks or packaged foods eaten on the go. Even where advanced apps exist, they tend to emphasize either calories/macros or micronutrients, but rarely combine effortless logging with deep nutrient insights in a single, intuitive experience.
A second, serious gap is safety: ingredient lists on packages are long, technical, and full of codes like emulsifiers, preservatives, and food additives that many consumers do not understand. People with allergies or intolerances must spend time scanning every tiny line of text to avoid allergens or specific additives, and even dedicated allergy apps are often limited to barcodes or static databases that may be incomplete or outdated.
Moreover, snacking behaviour is usually under-tracked: most users forget to log “small” snacks such as chips, biscuits, traditional local snacks, or late-night treats, even though these often contribute a high share of daily calories, fats, and sugars. Existing apps are optimized around structured meals (breakfast, lunch, dinner) and do not actively help users notice snacking patterns or offer healthier alternatives in context.
In summary, there is a need for an app that goes beyond calorie counting to offer holistic nutrient tracking, automatic detection of allergens and risky additives, and intelligent monitoring of snacking behaviour—with minimal manual input from the user.
Proposed Solution 💡
NutriSense AI is a mobile app that uses optical character recognition (OCR) and AI to transform any packaged-food label or snack into a rich, structured nutrition log—including calories, macronutrients, vitamins, minerals, allergens, and selected food additives—while also learning a user’s snacking habits over time. The goal is to make holistic diet awareness as simple as taking a photo, while using AI to surface meaningful insights that are often missed in traditional calorie-centric tracking.
1. Holistic Nutrient Tracking (Beyond Calories)
Instead of tracking only calories and basic macros, NutriSense AI stores and visualizes a full nutrient profile for each logged item, including proteins, carbohydrates, fats, fibre, key vitamins (e.g., A, B-complex, C, D), and minerals (e.g., iron, calcium, magnesium). Drawing inspiration from advanced nutrition apps that already go deeper into micronutrients, the app generalizes this capability and makes it automatic through OCR-driven label extraction.
Users can see daily and weekly trends, such as “low in iron this week” or “very high in saturated fat from snacks,” rather than only a calorie bar.
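A trend like “low in iron this week” can be derived by comparing each nutrient’s weekly average against a reference daily value. A minimal sketch, assuming a simple per-day log format; the nutrient keys, reference amounts, and 80%/150% thresholds are illustrative placeholders, not official dietary guidelines:

```python
# Illustrative reference daily amounts (placeholder values, not guidelines).
REFERENCE_DAILY = {"iron_mg": 18.0, "vitamin_c_mg": 75.0, "fibre_g": 25.0}

def weekly_flags(daily_logs):
    """daily_logs: list of {nutrient: amount} dicts, one per day.
    Returns {nutrient: 'low' | 'high'} for nutrients outside the band."""
    days = len(daily_logs)
    flags = {}
    for nutrient, target in REFERENCE_DAILY.items():
        avg = sum(day.get(nutrient, 0.0) for day in daily_logs) / days
        if avg < 0.8 * target:       # below 80% of reference -> flag "low"
            flags[nutrient] = "low"
        elif avg > 1.5 * target:     # above 150% of reference -> flag "high"
            flags[nutrient] = "high"
    return flags
```

A week averaging 8 mg of iron per day against an 18 mg reference would surface the “low in iron this week” message shown above.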
2. OCR-Based Food Label Scanning
NutriSense AI allows users to simply point their camera at any packaged-food label; an OCR and AI pipeline extracts:
- The full nutrition facts table (calories, macros, vitamins, minerals).
- The ingredient list, including additives and codes such as emulsifiers and INS numbers.
By leveraging AI-enhanced OCR similar to research systems that identify nutrients and allergens from labels in real time, the app converts raw text into structured data mapped to known nutrients and ingredient categories. This reduces manual data entry and makes logging packaged foods as quick as taking one photo.
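The mapping from raw OCR text to structured data could start with simple pattern matching over the recognized lines. A minimal sketch, assuming the OCR stage yields one nutrient per line; real OCR output is far noisier, and the field names and units here are simplified assumptions:

```python
import re

# Match lines such as "Protein 6.5 g" or "Energy 250 kcal".
LINE_RE = re.compile(
    r"(?P<name>[A-Za-z ]+?)\s+(?P<amount>\d+(?:\.\d+)?)\s*(?P<unit>g|mg|kcal)",
    re.IGNORECASE,
)

def parse_label(text):
    """Convert raw OCR text of a nutrition facts table into
    {nutrient_key: (amount, unit)}."""
    nutrients = {}
    for line in text.splitlines():
        m = LINE_RE.search(line)
        if m:
            key = m.group("name").strip().lower().replace(" ", "_")
            nutrients[key] = (float(m.group("amount")), m.group("unit").lower())
    return nutrients
```

In practice this rule-based pass would be backed by a learned model for misread characters, multilingual labels, and layout variation, but the structured output shape stays the same.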
3. Portion Estimation and Intake Calculation
After scanning a label, the user can quickly input or adjust the approximate portion consumed—for example, “½ packet,” “one serving,” or “30 g.” The app automatically scales all nutrients from the per-100 g or per-serving values on the label to estimate the user’s actual intake for that snack or meal.
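The scaling step is a straightforward proportion over the per-100 g values. A minimal sketch, with illustrative nutrient keys:

```python
def scale_portion(per_100g, grams_eaten):
    """Scale per-100 g label values to the portion actually consumed."""
    factor = grams_eaten / 100.0
    return {k: round(v * factor, 2) for k, v in per_100g.items()}
```

For example, 30 g of a snack listing 500 kcal per 100 g contributes 150 kcal to the day’s log.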
Over time, NutriSense AI can learn typical portion sizes for that user and suggest defaults (e.g., “You usually eat ~60 g of this snack”) to reduce friction.
4. Allergen and Additive Safety Layer
Using the extracted ingredient list, NutriSense AI cross-checks ingredients against customizable user profiles that specify:
- Allergies.
- Intolerances (e.g., lactose, gluten).
- Religious or ethical constraints.
- Personal “avoid lists” (e.g., certain emulsifiers or INS codes like INS 360).
Inspired by existing allergen scanners but extending beyond barcodes, the system uses OCR to scan any visible ingredient list and flags potential allergens or substances the user wants to avoid, even if the user is travelling or reading labels in another language.
The app provides a simple risk indicator (e.g., green / yellow / red) and short explanations so users understand why something is flagged.
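The cross-check itself can be framed as substring matching between the extracted ingredient list and the user’s profile. A minimal sketch, assuming a two-tier profile in which hard allergens map to red and personal watchlist items to yellow; the profile structure and severity rules are assumptions, not a defined spec:

```python
def check_ingredients(ingredients, profile):
    """ingredients: list of lowercase ingredient strings from OCR.
    profile: {'allergens': set, 'watchlist': set} of lowercase terms.
    Returns (risk_level, matching_ingredients)."""
    hits_red = [i for i in ingredients
                if any(a in i for a in profile["allergens"])]
    hits_yellow = [i for i in ingredients
                   if any(w in i for w in profile["watchlist"])]
    if hits_red:
        return "red", hits_red       # declared allergen present
    if hits_yellow:
        return "yellow", hits_yellow  # personal avoid-list item present
    return "green", []
```

Returning the matching ingredients alongside the colour is what lets the app show the short “why was this flagged” explanation.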
5. AI-Powered Snacking Insights
NutriSense AI treats snacks as first-class citizens rather than an afterthought, analyzing when and what users snack on and how those choices impact overall nutrient balance. The app clusters repeated “snacking episodes” (e.g., late-night chips, afternoon biscuits, sugary beverages) and generates insights such as “30% of your weekly saturated fat comes from late-night snacks.”
It then recommends healthier alternatives based on the user’s preferences and availability (e.g., nuts instead of chips, yoghurt instead of sugary desserts) and nudges the user with gentle, contextual prompts.
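An insight like “30% of your weekly saturated fat comes from late-night snacks” reduces to aggregating logged entries by time of day. A minimal sketch, assuming each log entry carries an hour, a snack flag, and a saturated-fat amount; the 21:00 cutoff for “late-night” is an illustrative choice:

```python
def late_night_sat_fat_share(entries):
    """entries: list of {'hour': int, 'is_snack': bool, 'sat_fat_g': float}.
    Returns the percentage of total saturated fat eaten as snacks
    at or after 21:00."""
    total = sum(e["sat_fat_g"] for e in entries)
    late = sum(e["sat_fat_g"] for e in entries
               if e["is_snack"] and e["hour"] >= 21)
    return 0.0 if total == 0 else round(100 * late / total, 1)
```

The same grouping logic generalizes to other episode types (afternoon biscuits, sugary beverages) by swapping the filter condition.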
6. High-Level User Flow
- Onboarding: User sets goals (e.g., “improve protein intake,” “reduce sugar”), allergies, and additives to avoid.
- Log Food via Camera: User scans a food label or selects a previously logged item.
- OCR + AI Processing: App extracts nutrients and ingredients, identifies allergens/additives, and estimates portion intake.
- Feedback Screen: User sees calories, full nutrient breakdown, allergen/additive warnings, and “snack impact” on the day.
- Insights & Nudges: Over time, app shows trends (e.g., “low in Vitamin D,” “high sugar from evening snacks”) and suggests alternatives.
Innovation Over Current Methods ⚙️
1. From Calories to Full-Stack Nutrition
Many mainstream apps focus mainly on calories and macronutrients, requiring extra effort or premium upgrades to see detailed micronutrients. NutriSense AI makes micronutrient tracking central and automatic by reading labels directly, without requiring the user to manually search complex databases.
This turns “calorie counting” into a more complete view of diet quality, not just quantity.
2. OCR + Allergen/Additive Intelligence in One App
There are apps that scan barcodes to check for allergens or basic nutrition, and others that use AI or OCR to read labels, but these are often narrow in scope (only allergies, or only calories, or only database-based scanning). NutriSense AI fuses OCR, allergen detection, additive flagging, and full nutrient analysis into a single workflow, using image-based label reading rather than relying purely on barcodes or static entries.
This enables real-time analysis even for niche, local, or unpackaged products with visible labels.
3. Intelligent Focus on Snacking Behaviour
Most diet platforms are structured around meals and treat snacks as optional extra entries, meaning many snacks never get logged and their impact is invisible. NutriSense AI explicitly models snacking patterns, surfaces their nutritional impact, and uses AI-driven pattern recognition to recommend better alternatives, which is rarely emphasized in current consumer apps.
This shift acknowledges that “small” snacks often make the biggest difference in real-world diets.
4. Personalizable “Avoid List” for Additives and INS Codes
While some apps flag common allergens, very few allow users to define custom avoid lists for specific additives, emulsifiers, or INS codes that are personally concerning. NutriSense AI lets users add items like “avoid emulsifier INS 360” or specific categories (e.g., “avoid artificial sweeteners”), and the OCR pipeline learns to flag these from the raw ingredient text.
This gives users fine-grained control over their diet beyond standard allergen categories.
Built With
- brain
- docs