Inspiration

Our inspiration for creating this app came from a very personal place. One of our group members has severe food allergies and was even hospitalized after unknowingly eating a dish that contained peanuts. As anyone with allergies will tell you, they have to constantly check what they’re eating. But when food is unlabeled or information isn’t clear, it’s all too easy to make a dangerous mistake.

We wanted to build something that could help prevent allergic reactions and give peace of mind to others in similar situations. That’s how we came up with FoodSafe, an app that empowers people with allergies to feel safe, informed, and in control.

But wait, there's more: let's say you're at a restaurant and unsure whether there are peanuts in your food, but the server or cook doesn't speak the same language as you. We have a built-in interpreter that will ask that question for you! FoodSafe has you covered no matter where in the world you are!

What it does

  • Open FoodSafe and take a picture of what you are eating
  • In less than five seconds you'll get a popup telling you whether the food is safe to eat, might contain an allergen, or does contain an allergen, along with which allergens were detected.
  • If FoodSafe thinks the food might contain an allergen, it pulls up a translate button in case you need to ask someone about the food who doesn't speak your language.
  • The user can choose from over 50 target languages, and when they hit translate, FoodSafe uses text-to-speech to ask out loud whether the food contains any of their allergens.

How we built it

  • We used React Native to build a responsive front end
  • With FastAPI we set up an API layer to enable efficient CRUD operations between the frontend and backend
  • After receiving an image from the frontend, we use Gemini to identify the food and flag potential allergens
  • We send the results as JSON through FastAPI to the frontend to display to the user
  • If the user wants to translate, we use the Google Cloud Translation API to create a question to ask in the specified language
  • We then pass that translated question through the Google Cloud Text-to-Speech API to generate an audio recording in that language
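The verdict that gets sent back as JSON can be sketched as a small pure function. This is a sketch only: the function name `classify_scan`, the field names, and the assumption that Gemini's response parses into `contains`/`may_contain` lists are illustrative, not our exact implementation.

```python
def classify_scan(detected, user_allergens):
    """Turn the model's output into the verdict the frontend displays.

    `detected` is assumed to be a dict like
    {"contains": [...], "may_contain": [...]} parsed from the model's
    JSON response; `user_allergens` is the user's saved allergen list.
    """
    confirmed = sorted(set(detected.get("contains", [])) & set(user_allergens))
    possible = sorted(set(detected.get("may_contain", [])) & set(user_allergens))
    if confirmed:
        # Definite match: warn immediately and list the allergens found.
        return {"status": "contains", "allergens": confirmed, "show_translate": False}
    if possible:
        # Uncertain match: surface the translate button so the user can ask.
        return {"status": "may_contain", "allergens": possible, "show_translate": True}
    return {"status": "safe", "allergens": [], "show_translate": False}
```

The `show_translate` flag is what drives the translate button described above: it only appears in the "might contain" case.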
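The translate-then-speak step looks roughly like the following. The question template and helper names are ours for illustration; the Google Cloud client calls follow the official Python libraries but require credentials to be configured before they will actually run.

```python
def build_question(allergens):
    """Compose the English question we send for translation."""
    return f"Does this dish contain {', '.join(allergens)}?"


def translate_and_speak(allergens, target_lang="es"):
    """Translate the question and synthesize it as audio (MP3 bytes)."""
    # Imported lazily so the pure helper above works without credentials.
    from google.cloud import translate_v2 as translate
    from google.cloud import texttospeech

    # Translate the English question into the user's chosen language.
    translated = translate.Client().translate(
        build_question(allergens), target_language=target_lang
    )["translatedText"]

    # Synthesize the translated question as speech.
    tts = texttospeech.TextToSpeechClient()
    response = tts.synthesize_speech(
        input=texttospeech.SynthesisInput(text=translated),
        voice=texttospeech.VoiceSelectionParams(language_code=target_lang),
        audio_config=texttospeech.AudioConfig(
            audio_encoding=texttospeech.AudioEncoding.MP3
        ),
    )
    return response.audio_content  # raw MP3 bytes for the app to play
```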

Challenges we ran into

  • We had issues with being on a public Wi-Fi network; it interfered with Expo, so we had to use ngrok to tunnel our connection.
  • Spotty documentation for some Gemini functions and Google Cloud methods
  • None of us had used FastAPI before, so learning it and incorporating it into our code was a bit of a challenge.
  • We focused a lot on UI/UX design, but since none of us were very familiar with Figma, we had to learn as we went.

Accomplishments that we're proud of

  • FoodSafe solves a very important problem and has the potential to save lives.
  • We quickly figured out the different components of the project and settled on a tech stack

What we learned

  • In past hackathons our group focused more on making a cool piece of software than on making something that solves a real problem. This time we really thought about problems in our own lives and how we could solve them.
  • We learned how to use FastAPI to integrate the front and back ends of an application
  • We've realized the importance of UX design; in past hackathons it was something of an afterthought when it should be one of the main focuses.

What's next for FoodSafe

  • We plan to remove the reliance on an internet connection for API calls and make the application fully usable offline. This will expand the usability of FoodSafe so that it can keep you safe wherever you are!
  • We also plan to incorporate other image recognition and vision models to create a sort of model ensemble approach.

Built With

  • React Native
  • FastAPI
  • Gemini
  • Google Cloud Translation
  • Google Cloud Text-to-Speech
