*"Does this food have coconuts?"* We all hate to be that one person. But if you live with dietary restrictions, you don't really have a choice, unless you want to wake up in a hospital bed. And what if you travel to a country where you don't speak the language?

Our web app helps people with dietary restrictions and food allergies make safer food choices. Using machine learning, the system predicts what a dish might contain and, via the Gemini API, provides helpful background information, all through a simple, user-friendly interface.

You just describe your dish and we give you possible matches for it, along with an allergen report. You also get to learn all about your dish: its history, the geography it comes from, and some eco-swaps for sustainability. If you decide to cook it yourself, a recipe is available at a single click! Or if you are a tutorial person and want to see a demonstration, click the YouTube link to find a good video that shows you how to make the dish!

Inspiration

  1. Our idea was inspired by our teammate Sahana, who had to ask “Does this food have coconuts?” every time she got food at the QWER hackathon, highlighting the everyday challenge of navigating food restrictions.
  2. Our teammate Orion envisioned integrating recipes into a system that makes traditional culinary recipes available by description.

We added a dash of allergies and a pinch of culinary heritage and baked it for 24 hours to get AllerGuess.

Features

AllerGuess is a webpage where users enter the allergens they want to search for and a description of the food in front of them; whether or not they know what the dish is, AllerGuess works either way. Its main feature is estimating the percentage likelihood of allergens in the described dish. Other features include a recipe finder, a history-and-origin analyzer, and alternatives for eco-swaps and allergens. It also shows pictures of the most likely dish, and you can select other likely dishes to compare; their information and photos are available too!

How we built it

The best dishes require good taste and presentation, so we split the work between us into frontend and backend:

  1. Split development into frontend and backend. This is where we determined the inputs and outputs so that we wouldn't step on each other's code.
  2. Built a machine learning model using a sigmoid function and gradient descent to predict the likelihood of ingredients in described dishes, trained on data that Anthropic's AI generated for us.
  3. Integrated Gemini API to provide information on the identified dish.
  4. Used Wikipedia images to display the pictures of the dish for better identification.
  5. Used VS Code to develop an HTML-based webpage and connect the ML model to an easy-to-use interface.
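The model in step 2 is essentially logistic regression: a weighted sum of description features squashed through a sigmoid, with the weights fitted by gradient descent. A minimal sketch on made-up toy data (the feature names, labels, and hyperparameters are illustrative, not our real training set):

```python
import math

def sigmoid(z):
    """Logistic squashing function: maps any score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=5000):
    """Fit logistic-regression weights with batch gradient descent."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    m = len(samples)
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * n, 0.0
        for x, y in zip(samples, labels):
            # Prediction error drives the gradient of the log-loss.
            err = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
            for i in range(n):
                grad_w[i] += err * x[i]
            grad_b += err
        w = [wi - lr * g / m for wi, g in zip(w, grad_w)]
        b -= lr * grad_b / m
    return w, b

# Toy training set: features = [description mentions "curry", mentions "macaroon"],
# label = 1 if the dish contained coconut. Purely illustrative.
X = [[1, 0], [1, 1], [0, 1], [0, 0]]
y = [1, 1, 1, 0]
w, b = train(X, y)

# Probability estimate for a new dish described as a "curry".
print(f"estimated coconut likelihood: {sigmoid(w[0] + b):.0%}")
```

The same fit-then-score loop runs per allergen, which is where the percentage estimates in the report come from.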
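Step 3's Gemini integration can be sketched with the `google-generativeai` Python SDK. The model name, prompt wording, and `GEMINI_API_KEY` environment variable below are assumptions for illustration, not our exact code:

```python
import os

def build_dish_prompt(description, allergens):
    """Compose one prompt asking Gemini for dish background and allergen context."""
    return (
        f"A user described this dish: {description!r}.\n"
        f"They need to avoid: {', '.join(allergens)}.\n"
        "Briefly give the dish's likely name, its history and region of origin, "
        "a short recipe outline, and sustainable ingredient swaps."
    )

prompt = build_dish_prompt("rice noodles in a creamy white broth", ["coconut", "peanut"])

if os.environ.get("GEMINI_API_KEY"):
    import google.generativeai as genai  # pip install google-generativeai
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")  # model name is an assumption
    print(model.generate_content(prompt).text)
else:
    print(prompt)  # offline: just show what we would send
```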

Challenges we ran into

  1. Staying focused and getting into a productive coding flow (including spending a lot of time settling on the project idea).
  2. Understanding how each other's code worked and connecting outputs and inputs correctly; we solved this by clearly defining the data flow before coding.
  3. Finding training data for the model; we used AI tools to help generate diverse food-related training examples.
  4. Image display. Google does not allow automated bots to scrape images from its search results. When we pivoted to image generation, the generated images were often unidentifiable and occasionally inaccurate. As a final solution, we used Wikipedia's freely licensed images, since most dishes have a picture on Wikipedia.
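That Wikipedia fallback boils down to one MediaWiki API query for a page's lead thumbnail. A rough sketch (the dish title and User-Agent string are placeholders):

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def thumbnail_url_query(title, size=400):
    """Build a MediaWiki API URL asking for a page's lead thumbnail."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "pageimages",
        "pithumbsize": size,
        "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)

def fetch_thumbnail(title):
    """Return the thumbnail image URL for a dish's Wikipedia page, or None."""
    req = urllib.request.Request(
        thumbnail_url_query(title),
        headers={"User-Agent": "AllerGuess-demo/0.1"},  # placeholder UA
    )
    with urllib.request.urlopen(req) as resp:
        pages = json.load(resp)["query"]["pages"]
    page = next(iter(pages.values()))
    return page.get("thumbnail", {}).get("source")

# No network needed just to see the request we would make:
print(thumbnail_url_query("Paella"))
```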

Accomplishments that we're proud of

  1. Creating a more inclusive food experience for people with dietary restrictions.
  2. Completing our first hackathon project!!!! :)
  3. Building a working demo that matches our vision (and looks cute!)
  4. Having a lot of fun while learning.

What we learned

  1. How to integrate the Gemini API using Python.
  2. How to work effectively as a team and communicate clearly.
  3. Using mock data for the front end.
  4. Increased familiarity with Python.

What's next for AllerGuess

  1. Improve AI accuracy by training on larger, more diverse global food datasets.
  2. Add image-based detection so users can upload or take photos of food for analysis.
  3. Create personalized allergen profiles to give tailored warnings based on each user’s dietary needs.
  4. Expand the cultural recipe database to account for regional ingredient variations.
  5. Enhance accessibility with voice input, audio output, visual risk indicators, and multilingual support.
  6. Optimize for mobile use, so it’s easier to use on the go.
