Inspiration

Food decisions happen constantly throughout the day, yet most of them occur subconsciously. Research suggests people make over 200 food-related decisions daily, but are consciously aware of only a small fraction of them. These decisions are often influenced by habits, convenience, and cravings rather than the body’s actual nutritional needs.

At the same time, emerging research in biosensing and wearable technology suggests that future devices could detect biochemical and physiological indicators related to metabolism, hunger hormones, hydration, and nutrient balance.

This sparked our idea: What if people could see their body’s nutritional needs in real time before choosing what to eat?

NutriSense explores this speculative future where technology reveals an invisible sensory layer of metabolism and nutrition, helping people make more informed food choices.

What it does

NutriSense is a speculative wearable ecosystem that helps users align their cravings with their body’s actual nutritional needs.

The system consists of:

  1. SenseLoop Neck Hub: a lightweight collarbone wearable that monitors metabolic indicators such as hydration, nutrient deficiencies, hormonal hunger signals, and stress markers.

  2. AR Lenses Interface: augmented reality lenses that visualize how compatible a food item is with the user’s current body state.

  3. Metabolic Intelligence Engine: an AI system that interprets sensor data, lifestyle patterns, and environmental context to generate personalized food recommendations.

Instead of restricting cravings, NutriSense helps users understand and navigate them, suggesting food options that satisfy taste while supporting metabolic balance.
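To make the compatibility idea concrete, here is a minimal sketch of how the Metabolic Intelligence Engine might score a food against the body's current needs. Everything here is illustrative: the function name, nutrient keys, and values are hypothetical placeholders, not real NutriSense data or sensor output.

```python
# Hypothetical sketch: score how well a food covers the body's current
# nutrient gaps, as estimated by the wearable. All names and numbers
# below are illustrative assumptions, not actual NutriSense internals.

def compatibility_score(nutrient_gaps, food_profile):
    """Return a 0-1 score: the average fraction of each detected
    nutrient gap that this food would fill (oversupply is capped)."""
    if not nutrient_gaps:
        return 0.5  # no detected needs: show a neutral score
    covered = 0.0
    for nutrient, gap in nutrient_gaps.items():
        supplied = food_profile.get(nutrient, 0.0)
        covered += min(supplied, gap) / gap  # fraction of this gap filled
    return round(covered / len(nutrient_gaps), 2)

# Example: the wearable estimates the user is low on iron and vitamin C.
gaps = {"iron_mg": 6.0, "vitamin_c_mg": 40.0}
spinach_salad = {"iron_mg": 4.0, "vitamin_c_mg": 30.0, "sodium_mg": 120.0}
print(compatibility_score(gaps, spinach_salad))  # prints 0.71
```

A real engine would also weigh lifestyle patterns and context, but even this toy version captures the core reframing: the score reflects what the body needs now, not a fixed notion of "healthy" food.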

How we built it

We approached NutriSense as a speculative design project grounded in human behavior and emerging technology.

Our process included:

Problem framing: We explored how people make food decisions and identified decision fatigue and a lack of nutritional awareness as key challenges.

Future technology exploration: We studied current research on biosensing technologies such as glucose monitors, sweat-analysis sensors, and neural interfaces to imagine how future systems could interpret metabolic signals.

Experience design: We designed a wearable ecosystem combining:

a collarbone biosensor hub

AR visualizations

a layered information interface

User scenarios: We created three everyday use cases (late-night cravings, grocery shopping, and nutrient-based meal recommendations) to demonstrate how NutriSense could support real decision-making moments.

Interface prototyping: Using Figma, we designed AR overlays and a companion dashboard that visualize food compatibility, nutrient gaps, and meal suggestions in a simple, intuitive way.

Challenges we ran into

One of the biggest challenges was balancing speculative technology with believable user experience.

We had to avoid making the system feel like it was “reading thoughts” while still communicating how biochemical and neural signals could reveal cravings and metabolic needs.

Another challenge was preventing information overload. Nutrition data can easily become overwhelming, so we designed a layered interface that only surfaces deeper insights when the user focuses on them.
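The layered interface described above could be sketched as a simple mapping from user attention to information depth. This is a hypothetical illustration of the design logic, not shipped code; the gaze-dwell thresholds and layer names are assumptions we made for the prototype.

```python
# Hypothetical sketch of layered disclosure in the AR overlay: show only
# a headline badge at a glance, and reveal deeper nutrition detail as the
# user's gaze dwells on a food item. Thresholds are illustrative.

def overlay_layers(dwell_seconds):
    """Map gaze dwell time (seconds) to the overlay layers to render."""
    layers = ["compatibility_badge"]        # always-visible glance layer
    if dwell_seconds >= 1.0:
        layers.append("top_nutrient_gaps")  # brief "why this score" layer
    if dwell_seconds >= 3.0:
        layers.append("full_breakdown")     # detailed macro/micro view
    return layers

print(overlay_layers(0.4))  # prints ['compatibility_badge']
```

Keeping the default layer to a single badge was our main defense against information overload: detail appears only when the user asks for it with their attention.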

Finally, translating invisible body states into meaningful visual cues required careful thinking about what users should see and when.

Accomplishments that we're proud of

We’re proud that NutriSense:

Introduces the concept of metabolic perception as a new sensory experience.

Translates complex physiological data into simple, actionable food guidance.

Demonstrates how AR interfaces could support everyday decision-making.

Shows a complete ecosystem combining wearable hardware, AI interpretation, and AR visualization.

Most importantly, the concept reframes nutrition technology from tracking behavior after eating to guiding choices before eating.

What we learned

Through this project, we learned that many wellness technologies focus heavily on measurement, but fewer focus on decision-making moments.

Designing for these moments requires understanding human psychology, habits, and environments where decisions actually occur.

We also learned how speculative design can help explore future technologies while still addressing real-world behavioral challenges.

This project reinforced the importance of designing systems that augment human intuition rather than replace it.

What's next for NutriSense: From Invisible Nutrition Signals to Smart Choices

NutriSense opens the door to a future where people can understand their bodies in ways that were previously invisible.

Next steps for the concept include:

Expanding biosensing capabilities: Future wearables could detect micronutrient levels, inflammation markers, and metabolic responses to specific foods.

Improving contextual intelligence: The system could integrate environmental data such as nearby restaurants, grocery inventory, and meal-preparation options.

Personalized nutrition ecosystems: NutriSense could connect with food-delivery services, smart kitchens, and grocery platforms to automatically recommend meals aligned with the user’s body state.

Ultimately, NutriSense imagines a future where nutrition becomes adaptive, personalized, and intuitive, helping people move from invisible body cues to smarter everyday food choices.

Built With

  • chatgpt
  • english
  • figma
  • figmamake
  • gemini
  • lovable
  • midjourney
  • nemo.ai