Inspiration

Dining out with allergies is stressful: menus are vague, and ingredient lists aren’t always available. We wanted a pocket-sized solution that provides instant, trustworthy allergy insights from nothing more than a photo.

What it does

Loma lets you snap a picture of your meal and immediately highlights potential allergens: milk, egg, fish, shellfish, wheat/gluten, peanut, tree nut, soy, and sesame. Results include confidence scores and a clear, user-friendly verdict.
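For illustration, a single scan result carrying this information might be modeled like the sketch below; the class and field names are hypothetical, not Loma’s actual schema:

```python
# Hypothetical shape of one scan result; names are illustrative only.
from dataclasses import dataclass

@dataclass
class AllergenHit:
    allergen: str  # one of the nine tracked allergens, e.g. "peanut"
    score: float   # model confidence in [0.0, 1.0]

@dataclass
class ScanResult:
    hits: list[AllergenHit]  # allergens the model flagged in the photo
    verdict: str             # plain-language summary shown to the user

# Example instance:
result = ScanResult(
    hits=[AllergenHit("peanut", 0.82), AllergenHit("tree nut", 0.34)],
    verdict="Likely contains peanut",
)
```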

How we built it

We built an Android app in Android Studio that sends meal photos to a Python-based Cloud Vision backend hosted on Firebase. The backend analyzes each image and returns likely allergens with probability scores, which the app presents directly in its interface.
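For concreteness, here is a minimal sketch of what that server-side step could look like, assuming the backend uses the google-cloud-vision Python client’s label detection plus a hand-written keyword table; the table, function name, and mapping logic below are our illustration, not Loma’s actual code:

```python
# Minimal sketch: map Cloud Vision food labels to tracked allergens.
# The keyword table is illustrative, not an exhaustive allergen model.
from google.cloud import vision

ALLERGEN_KEYWORDS = {
    "milk": ["milk", "cheese", "butter", "cream", "yogurt"],
    "egg": ["egg", "omelette", "mayonnaise"],
    "fish": ["fish", "salmon", "tuna"],
    "shellfish": ["shrimp", "crab", "lobster", "oyster"],
    "wheat/gluten": ["bread", "pasta", "noodle", "pastry"],
    "peanut": ["peanut"],
    "tree nut": ["almond", "walnut", "cashew", "pecan"],
    "soy": ["soy", "tofu", "edamame"],
    "sesame": ["sesame", "tahini"],
}

def detect_allergens(image_bytes: bytes) -> dict[str, float]:
    """Return {allergen: confidence} for food labels Vision finds in the photo."""
    client = vision.ImageAnnotatorClient()
    response = client.label_detection(image=vision.Image(content=image_bytes))
    scores: dict[str, float] = {}
    for label in response.label_annotations:
        name = label.description.lower()
        for allergen, keywords in ALLERGEN_KEYWORDS.items():
            # Keep the highest-confidence label implicating each allergen.
            if any(kw in name for kw in keywords):
                scores[allergen] = max(scores.get(allergen, 0.0), label.score)
    return scores
```

With a shape like this, the Android side only needs to upload the photo bytes and render the returned scores.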

Challenges we ran into

Our biggest hurdle was deploying the backend. This was our first time connecting an Android app to a server-side model, and getting everything to talk to each other proved more difficult than expected.

Accomplishments that we're proud of

We’re proud that we built a functioning Android app from scratch, our very first one. Even without a fully working backend, we designed and implemented the user-facing side of an idea we care about deeply.

What we learned

We learned that food image analysis is far from straightforward. Allergen detection is especially complex when ingredients are blended or hidden within dishes. On the technical side, we gained valuable experience in app development, backend deployment, and working with image recognition models.

What's next for Loma

Our next step is completing the backend integration so the app can deliver live results. Beyond that, we want to expand the training data for better accuracy and add multi-allergen filtering to give users more transparency into what’s actually in a dish.
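As a sketch of that filtering step, assuming each user stores a set of allergens to avoid (the function name and threshold below are hypothetical):

```python
# Hypothetical multi-allergen filter: surface only the allergens in the
# user's profile that the model scored above a confidence threshold.
def flag_for_user(scores: dict[str, float],
                  user_allergens: set[str],
                  threshold: float = 0.5) -> list[str]:
    """Return the user's allergens detected above `threshold`."""
    return [a for a in user_allergens if scores.get(a, 0.0) >= threshold]

# Example: a user avoiding peanut and milk scans a dish.
print(flag_for_user({"peanut": 0.82, "soy": 0.4}, {"peanut", "milk"}))
# -> ['peanut']
```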

Built With

android-studio, firebase, google-cloud-vision, python
