Inspiration

We really like food, but eating it shouldn't come with risks. Being able to understand the food you're about to eat, and whether or not you're allergic to it, from a single picture is both simple and fun.

What it does

The user enters their allergies into the app, then takes pictures of food to extract its ingredients. The app warns the user if any ingredient they are allergic to is in the food, and also shows dietary information about the food (vegan, vegetarian, etc.). A sketch of the allergen check is shown below.
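
As a rough sketch of that core check (hypothetical function and variable names; the real app does this inside React Native, but the logic is the same), matching extracted ingredients against the user's allergy list could look like:

```python
def find_allergens(user_allergies, ingredients):
    """Return the subset of the user's allergies that appear in the
    ingredient list (case-insensitive substring match)."""
    hits = []
    for allergy in user_allergies:
        needle = allergy.lower()
        if any(needle in ingredient.lower() for ingredient in ingredients):
            hits.append(allergy)
    return hits

# Example: catch peanuts hidden in a sauce.
allergies = ["peanut", "shellfish"]
ingredients = ["noodles", "peanut sauce", "scallions"]
print(find_allergens(allergies, ingredients))  # ['peanut']
```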

How we built it

We built the app with React Native through Expo, along with Node.js. Food in the photo is detected with the Google Cloud Vision API, and a Python script queries the Edamam API to pull the ingredients and other information about the food. A sketch of that pipeline follows.
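
Here is a minimal sketch of the detection-plus-lookup pipeline in Python. It's an illustration under stated assumptions, not our exact code: `YOUR_APP_ID`/`YOUR_APP_KEY` are placeholders, and the Edamam response fields read here (`hints`, `food`, `label`, `foodContentsLabel`) are our reading of the Food Database parser response rather than code from our repo.

```python
import requests
from google.cloud import vision

def detect_food(image_path):
    """Use Google Cloud Vision label detection to guess the food in a photo."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Take the highest-confidence label as the food name.
    return response.label_annotations[0].description

def lookup_food(food_name):
    """Query the Edamam Food Database parser for ingredient info.
    app_id/app_key are placeholders; field names are assumptions."""
    resp = requests.get(
        "https://api.edamam.com/api/food-database/v2/parser",
        params={"app_id": "YOUR_APP_ID", "app_key": "YOUR_APP_KEY",
                "ingr": food_name},
    )
    resp.raise_for_status()
    hints = resp.json().get("hints", [])
    return hints[0]["food"] if hints else None

food = lookup_food(detect_food("plate.jpg"))
if food:
    print(food.get("label"), "-", food.get("foodContentsLabel", "no ingredient list"))
```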

Challenges we ran into

We ran into many challenges, most notably gluing all the pieces together and debugging the many errors we hit while working with React Native.

Accomplishments that we're proud of

We managed to integrate several different APIs to create something unique and cool.

What we learned

We got better at React Native and mobile app development.

What's next for AllerVision

A nicer GUI, and improved analytics on the food it recognizes.
