Inspiration
AllergenScan AI was born from a personal challenge. After witnessing a friend struggle with severe food allergies during social gatherings, I realized how difficult it can be to quickly identify potential allergens in unfamiliar foods. Traditional methods like manually checking ingredients or asking servers don't always provide complete information. I envisioned a solution that would leverage AI to give people with food allergies more confidence and independence in their daily food choices.
What it does
AllergenScan AI transforms your smartphone into an intelligent allergen detection tool. Users simply take a photo of any food item, and our dual-stage AI system first identifies the food and then analyzes it for common allergens. Within seconds, users receive a comprehensive report of potential allergens present in their meal, helping them make informed dietary decisions instantly.
How we built it
We built AllergenScan as a full-stack mobile application using React Native and Expo for cross-platform compatibility. The frontend is developed with TypeScript for type safety and uses React Navigation for seamless screen transitions.
The intelligence core of the application consists of two key AI components:
- A computer vision system powered by Clarifai's food recognition models that accurately identifies food items from images
- A large language model integration that acts as our allergen database, providing comprehensive allergen information for the identified foods
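The two stages above can be sketched in TypeScript. This is a minimal illustration, not the actual service code: the response shape is a simplified version of what a food-recognition API like Clarifai's returns, and the JSON-array convention for the LLM reply is an assumption about how the allergen prompt is structured.

```typescript
// Illustrative shape of one concept in a food-recognition response
// (simplified; real Clarifai responses carry more fields).
interface FoodConcept {
  name: string;  // e.g. "pad thai"
  value: number; // confidence score in [0, 1]
}

// Stage 1: pick the most confident food label, rejecting low-confidence
// guesses. The 0.5 threshold is an assumed cutoff, not a Clarifai default.
function topFoodConcept(concepts: FoodConcept[]): string | null {
  if (concepts.length === 0) return null;
  const best = concepts.reduce((a, b) => (b.value > a.value ? b : a));
  return best.value >= 0.5 ? best.name : null;
}

// Stage 2: parse the LLM's allergen answer. We assume the prompt asks
// for a JSON array of allergen names; real replies may need more
// defensive handling than this.
function parseAllergens(llmReply: string): string[] {
  try {
    const parsed = JSON.parse(llmReply);
    return Array.isArray(parsed) ? parsed.map(String) : [];
  } catch {
    return []; // treat an unparseable reply as "no allergens identified"
  }
}
```

In a pipeline like this, keeping the threshold and parsing logic in pure functions makes both stages easy to unit-test without hitting either external API.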
The application uses a simplified authentication system with client-side credential validation for demonstration purposes. It also maintains a clean separation of concerns with dedicated services for API communication, authentication, and image processing.
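A client-side credential check of the kind described might look like the sketch below. All names and the hard-coded user list are hypothetical, and, as the source notes, this pattern is for demonstration only; a production app would validate credentials against a backend.

```typescript
// Demo-only client-side credential validation (illustrative; a real app
// must never ship credentials in the bundle or validate them locally).
interface DemoUser {
  email: string;
  password: string;
}

// Hypothetical hard-coded demo account.
const DEMO_USERS: DemoUser[] = [
  { email: "demo@example.com", password: "demo1234" },
];

function validateCredentials(email: string, password: string): boolean {
  // Normalize the email, then match purely on the client; no server call.
  const normalized = email.trim().toLowerCase();
  return DEMO_USERS.some(
    (u) => u.email === normalized && u.password === password
  );
}
```

Keeping this behind a dedicated auth service, as the app does, means the demo check can later be swapped for a real backend call without touching the screens that consume it.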
Challenges we ran into
Managing dependencies and ensuring compatibility across the React Native ecosystem proved challenging, especially with the latest Expo SDK. Resolving package conflicts required careful version management and occasional workarounds.
Accomplishments that we're proud of
We are particularly proud of combining multiple AI technologies into a seamless, user-friendly experience. The application's intuitive UI/UX design also makes complex AI technology accessible to users of all technical backgrounds: from image capture to results display, the entire process feels natural and straightforward.
What we learned
This project taught us valuable lessons about integrating multiple AI services into a cohesive system. We also sharpened our React Native skills, particularly around camera integration and image processing, and working with the latest Expo SDK gave us insight into the evolving mobile development ecosystem.
Perhaps most importantly, we learned how technology can address real human challenges. Building AllergenScan reinforced our shared belief that AI can be harnessed to create practical solutions that improve everyday life.
What's next for AllergenScan AI
The future roadmap for AllergenScan includes several exciting enhancements:
- Implementing a calorie estimation feature to provide nutritional information alongside allergen details
- Expanding the food recognition capabilities to include regional and cultural dishes from around the world
- Adding personalized allergen profiles so users can customize detection for their specific allergies
- Developing an offline mode for basic functionality in areas with limited connectivity
- Creating a community feature where users can share verified allergen information about restaurant dishes