Category: Utility & Design
Problem Statement:
Millions of people with food allergies face constant worry about whether a product is safe to eat, making grocery trips stressful and time-consuming. They meticulously read tiny labels and often remain unsure even after scanning the ingredients multiple times. Barcode-scanning apps provide some relief, but they still require picking up and checking each product individually, adding to the frustration. Example 1: Sarah, a mom with two kids who have severe nut allergies, spends hours at the grocery store reading tiny labels and scanning products. Every trip feels overwhelming as she tries to ensure her family’s safety. Example 2: Alex, traveling in a foreign country, struggles to understand food labels in another language.
Target Users:
The target audience for this project includes:
Individuals with food allergies or dietary restrictions – People who need to avoid certain ingredients for health reasons, such as those with peanut, gluten, or dairy allergies.
Parents and caregivers – Particularly those shopping for children or loved ones with food allergies, who need to ensure the products they buy are safe.
Frequent travelers – Individuals who travel to foreign countries and face difficulties reading and understanding food labels in different languages.
Health-conscious consumers – Shoppers who want to be more informed about the products they buy, including those looking for organic, non-GMO, or specific dietary preferences.
Retailers and grocery stores – Businesses looking to enhance the shopping experience by offering a safer, more convenient option for customers with allergies or dietary needs.
Inspiration
This project was inspired by the everyday struggles faced by individuals with food allergies, and the desire to make their lives easier, safer, and more stress-free. We saw how much anxiety and time people spend deciphering labels, especially when shopping for themselves or loved ones with complex dietary needs. The value this project delivers is peace of mind—transforming grocery shopping into a seamless, personalized experience that removes the guesswork and constant vigilance. There’s a clear need for this solution because current methods, like manually reading labels or using barcode scanners, are time-consuming, unreliable, and often impractical in fast-paced environments or foreign countries. By providing real-time product fit information customized to the shopper’s preferences, we can help people shop smarter, faster, and with greater confidence.
Description: What it does
This project is a mixed reality solution designed to make grocery shopping easier and safer for people with food allergies. Using spatial anchors, the system displays personalized, real-time product information directly on the products on store shelves as the shopper browses, eliminating the need to check every label manually. Through an AI voice assistant interface, users receive guided insights into ingredients and allergens simply by holding or looking at a product. Shoppers can prepare their shopping list in advance and set their dietary restrictions in the user preferences section. Once in the store, the mixed reality app gives shoppers access to that list and those preferences, and allergy preferences can also be modified in real time.
The MR app filters products according to those preferences and overlays the result (✓ or ×) on the products on the shelves. This also improves accessibility for shoppers of different ages and abilities. By providing visual indicators directly in the user’s field of view, the app enables quick, hands-free decision-making and a more streamlined, secure shopping experience.
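The ✓/× overlay decision described above can be sketched as a simple preference check. This is an illustrative Python sketch, not the actual app code: the `check_product` helper, the ingredient lists, and the preference set are all assumptions made for the example.

```python
# Hypothetical sketch of the preference check behind the ✓ / × overlay.
# Product data and the check_product helper are illustrative assumptions.

def check_product(ingredients, avoided_allergens):
    """Return ('✓', []) if safe, or ('×', [matches]) listing hit allergens."""
    hits = sorted(
        allergen
        for allergen in avoided_allergens
        if any(allergen in ingredient for ingredient in ingredients)
    )
    return ("×", hits) if hits else ("✓", hits)

preferences = {"peanut", "milk"}  # set in the user preferences section
granola = ["oats", "honey", "peanut oil"]
rice_cakes = ["rice", "salt"]

print(check_product(granola, preferences))     # ('×', ['peanut'])
print(check_product(rice_cakes, preferences))  # ('✓', [])
```

Returning the matched allergens alongside the mark is what would let the voice assistant explain *why* a product is flagged, not just that it is.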
How we built it
With a lot of brainstorming, planning, and whiteboard discussions: https://drive.google.com/file/d/1taXFnmbz-EIPld-42kGtRF0VOTyWx1r1/view?usp=sharing
Exploration of open-source object detection models and self-trained models: https://drive.google.com/drive/folders/1su5HfrAD9zGQ6aJhrodlS2BEkBJKkwdq?usp=sharing
Preparing and exploring the dataset: https://drive.google.com/drive/folders/1OiaB3WQwnfoKVMkY6FhP3DQJMR-GnynU?usp=sharing
Trained model hosted for inference requests on the Roboflow platform: https://app.roboflow.com/grocery-items-y65ot/left-and-right-qjo6a/1
Exploring the AI kit provided by the organisers and integrating it into the project with help from mentors (special thanks to Fabian, XR Bootcamp): https://drive.google.com/drive/folders/1-CZZ9sUzGUCSNFh75nkxDZxcN33KFnRD
Unity project: https://github.com/utsav695/MR-Grocery
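The trained model linked above is served through Roboflow's hosted detection endpoint. The sketch below only builds such a request (it does not send it); the endpoint shape follows Roboflow's public hosted-inference API, and the API key and image bytes are placeholders.

```python
# Hedged sketch: constructing a Roboflow hosted-inference request.
# The API key and image bytes below are placeholders, not real credentials.
import base64

DETECT_URL = "https://detect.roboflow.com"

def build_inference_request(model_id, version, api_key, image_bytes):
    """Return (url, payload) for a POST to Roboflow's hosted detect API."""
    url = f"{DETECT_URL}/{model_id}/{version}?api_key={api_key}"
    # Roboflow's hosted API accepts the image as a base64 string in the body.
    payload = base64.b64encode(image_bytes).decode("ascii")
    return url, payload

url, payload = build_inference_request(
    "left-and-right-qjo6a", 1, "YOUR_API_KEY", b"<jpeg bytes here>"
)
print(url)
# → https://detect.roboflow.com/left-and-right-qjo6a/1?api_key=YOUR_API_KEY
# Actually sending it would look like:
#   requests.post(url, data=payload,
#                 headers={"Content-Type": "application/x-www-form-urlencoded"})
```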
Challenges we ran into
One of the initial challenges was access to the Quest device's live camera stream, which we needed in order to run the machine learning model in real time and keep its responses fast. We did manage to run the object detection model on our local computers; however, since the camera live-stream API endpoints are not yet exposed on the headset, feeding it frames from the device was not possible. We solved this by using spatial anchors placed in the physical store instead.
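The spatial-anchor workaround amounts to keying anchors pre-placed at shelf positions to product records, so the headset can label a location without any live camera feed. This is an illustrative Python sketch of that lookup, not the actual Unity/C# code; the anchor IDs and product data are invented for the example.

```python
# Illustrative sketch (not the actual Unity code) of the spatial-anchor
# workaround: anchors placed in the store map to known products, so no
# live camera stream is needed to decide the overlay label.

anchor_to_product = {
    "anchor-aisle3-shelf2-left":  {"name": "Granola",    "allergens": {"peanut"}},
    "anchor-aisle3-shelf2-right": {"name": "Rice cakes", "allergens": set()},
}

def label_for_anchor(anchor_id, avoided_allergens):
    """Resolve the anchor the headset is looking at to an overlay label."""
    product = anchor_to_product.get(anchor_id)
    if product is None:
        return "?"  # unmapped shelf position: no information to show
    return "×" if product["allergens"] & avoided_allergens else "✓"

print(label_for_anchor("anchor-aisle3-shelf2-left", {"peanut"}))   # ×
print(label_for_anchor("anchor-aisle3-shelf2-right", {"peanut"}))  # ✓
```

The trade-off is that the mapping must be maintained as shelves are restocked, which is why live object detection remains the goal once the camera stream API is available.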
Accomplishments that we're proud of
We’re proud of how this project brought together mixed reality, AI, and design thinking to create a truly personalized, user-friendly shopping experience. From a technical standpoint, integrating real-time image detection with spatial anchors was a major achievement, allowing us to deliver instant, context-aware information directly in a user's view. We also applied system thinking to ensure the technology works seamlessly in a dynamic, real-world environment like a grocery store. What stands out most is how we designed an intuitive, hands-free interface that not only simplifies a complex task but also responds to the unique needs of each user. This project has shown us the power of thoughtful design in making advanced tech feel natural and accessible.
What we learned
Through this project, we learned the importance of balancing advanced technology with real human needs. Mixed reality isn't just about flashy visuals—it's about designing a system that feels intuitive and natural in everyday settings. We discovered that real-time accuracy and seamless interaction are key when dealing with something as critical as food allergies. It also reinforced how crucial it is to keep the user experience simple, ensuring that the tech enhances life without adding complexity. Most importantly, we saw firsthand how thoughtful design and system thinking can turn cutting-edge tech into something genuinely helpful and easy to use.
What's next for [S15] Coconut - MR Food Information
Next steps for the project will involve expanding its capabilities by incorporating more options and refining the overall experience. Support for a broader range of dietary preferences, such as vegan or low-sugar, will be integrated, along with enhancements to language translation for better accessibility in various regions. Efforts will be made to scale the system for use in additional stores, including larger supermarkets and local retailers, ensuring compatibility with different product databases. Consideration will also be given to applying the technology to other product categories, such as skin care and pet products.
Real-World Impact:
Addresses a critical need for individuals and families dealing with food allergies, delivering convenience and safety through mixed reality. By offering personalized and intuitive product insights in real-time, this technology makes grocery shopping faster, safer, and less stressful. With the growing demand for tailored consumer experiences, this app bridges the gap between technology and daily life, offering practical benefits for millions of users.