Inspiration
During the hackathon, as I sat jotting down the foods I had consumed that day in my manual food log, the inconvenience of the process struck me. Each entry involved referring to packaging, guessing portion sizes, or looking up nutritional values online. This tedious, time-consuming method seemed archaic in our age of technology, and right there, amid the creative energy of the event, the idea for NutriScan AR was born.
What it does
I envisioned an app that seamlessly integrates Augmented Reality and machine learning to instantly recognize real-world food items and overlay digital nutritional information on them. This wouldn't just be a tool but an experience, making logging interactive and intuitive. As development progressed during the hackathon, NutriScan AR became more than a fix for a personal inconvenience: it became a new way to merge technology with health, making nutritional logging not just easier, but educational and engaging.
How we built it
Having no prior experience with Swift, I took a deep dive into its ecosystem. I decided to leverage Apple's Vision framework to recognize objects through the camera. Paired with a CoreML model, the app identifies various food items in real time. CoreData became my choice for storing detailed nutritional information about recognized foods, allowing for easy logging with a single tap. The AR integration was the icing on the cake, providing an interactive overlay of nutritional data that enhances the user experience.
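To make the pipeline concrete, here is a minimal sketch of what the Vision-plus-CoreML recognition step could look like. `FoodDetector` stands in for whatever bundled CoreML model is used (for example, a YOLOv8 model converted to CoreML format); the class name and configuration are assumptions for illustration, not the project's actual code.

```swift
import Vision
import CoreML

// Minimal sketch of the recognition pipeline. "FoodDetector" is a stand-in
// for the app's bundled CoreML model class (hypothetical name).
final class FoodRecognizer {
    private let request: VNCoreMLRequest

    init() throws {
        // Wrap the CoreML model so Vision can drive it.
        let model = try VNCoreMLModel(for: FoodDetector(configuration: MLModelConfiguration()).model)
        request = VNCoreMLRequest(model: model)
        request.imageCropAndScaleOption = .scaleFill
    }

    // Called with each camera frame (CVPixelBuffer) from an ARSession or AVCaptureSession.
    func detectFoods(in pixelBuffer: CVPixelBuffer) throws -> [(label: String, confidence: Float)] {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right)
        try handler.perform([request])

        // YOLO-style detectors yield VNRecognizedObjectObservation;
        // keep the highest-confidence label for each detected box.
        let observations = request.results as? [VNRecognizedObjectObservation] ?? []
        return observations.compactMap { obs in
            guard let top = obs.labels.first else { return nil }
            return (top.identifier, top.confidence)
        }
    }
}
```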
Challenges we ran into
Navigating Swift as a newcomer was itself a significant challenge. Getting acceptable real-time object detection performance out of a pre-trained model, since time constraints ruled out training one of my own, was quite another. Finally, tweaking the AR overlay so it was both user-friendly and informative was a meticulous process.
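For context, one common way to keep on-device detection responsive is to run the model on only a subset of camera frames and discard low-confidence results. The sketch below illustrates that idea; the frame stride and confidence threshold are illustrative values rather than the project's actual tuning, and `FoodRecognizer` refers to the hypothetical class sketched earlier.

```swift
import Vision
import CoreVideo

// Illustrative mitigation: run detection on every Nth frame and drop
// low-confidence results. The stride of 5 and threshold of 0.6 are
// example values, not numbers from the original project.
final class FrameThrottler {
    private var frameIndex = 0
    private let frameStride = 5                   // process 1 frame in 5
    private let minConfidence: VNConfidence = 0.6

    func process(_ pixelBuffer: CVPixelBuffer,
                 with recognizer: FoodRecognizer) -> [(label: String, confidence: Float)] {
        frameIndex += 1
        guard frameIndex % frameStride == 0 else { return [] }   // skip most frames

        let detections = (try? recognizer.detectFoods(in: pixelBuffer)) ?? []
        return detections.filter { $0.confidence >= minConfidence }
    }
}
```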
Accomplishments that we're proud of
Successfully melding AR with real-time food recognition was a monumental achievement for me. The fact that I could offer users instant nutritional feedback, negating the need for tedious manual entries, was immensely rewarding. Imagining how much room there was for expansion excited me to no end.
What we learned
This project was quite a learning curve. It underscored the significance of user experience in app development. While backend processes such as object detection are undeniably vital, the end-user interaction truly defines an app's success. I ventured deep into AR and machine learning, learning the nuances of optimizing performance without sacrificing accuracy.
What's next for NutriScan AR
There's so much potential for NutriScan AR. I envision features like tracking water intake, setting specific dietary goals, and even syncing with health apps to provide a comprehensive health overview. Expanding the food database is also on the horizon, aiming to cater to users from diverse culinary backgrounds globally. I also believe there's potential for collaborations with health experts to embed professional insights directly into the app, making NutriScan AR a one-stop solution for all things nutrition and health.
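As a rough illustration of what syncing with Apple's Health app could look like, the sketch below writes a dietary-energy sample via HealthKit. This is a forward-looking, hypothetical example rather than code from the current app, and it assumes the user has already granted write access through HKHealthStore's requestAuthorization.

```swift
import HealthKit

// Forward-looking sketch: log calories for a recognized food into HealthKit.
// Assumes write authorization for dietaryEnergyConsumed was requested earlier.
func logCalories(_ kilocalories: Double, eatenAt date: Date, store: HKHealthStore) {
    guard let energyType = HKQuantityType.quantityType(forIdentifier: .dietaryEnergyConsumed) else { return }
    let quantity = HKQuantity(unit: .kilocalorie(), doubleValue: kilocalories)
    let sample = HKQuantitySample(type: energyType, quantity: quantity, start: date, end: date)

    store.save(sample) { _, error in
        if let error = error { print("HealthKit save failed: \(error)") }
    }
}
```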
Built With
- swift
- visionos
- yolov8