Being a vegetarian can sometimes be a struggle. Trips to the grocery store often involve reading ingredient labels to determine whether we can eat the food. Even after reading labels for most of my life, my family and I sometimes overlook an ingredient that we cannot eat.

When I heard about Expo and how easily it interacted with the Google Cloud API, I realized that over the course of Cal Hacks I could create an app that fulfilled a need not only in my own life, but also in the lives of my family and of other vegetarians.

What it does

The app prompts the user to take a picture of an ingredient label, then scans it for ingredients that violate the user's diet specifications. If no ingredient would prevent the user from eating the food item, Veggy tells the user that the food meets his or her dietary needs.
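The core check can be sketched as a simple blocklist scan over the label text. This is only an illustration, not Veggy's actual code: the blocklist entries and the names `BLOCKLISTS` and `checkLabel` are hypothetical (the real app compiled hundreds of ingredients).

```javascript
// Hypothetical blocklists -- the real app uses hundreds of entries per diet.
const BLOCKLISTS = {
  vegetarian: ["gelatin", "rennet", "carmine", "lard"],
  vegan: ["gelatin", "rennet", "carmine", "lard", "whey", "casein", "honey"],
};

// Return every blocklisted ingredient that appears in the label text.
function checkLabel(labelText, diet) {
  const text = labelText.toLowerCase();
  return BLOCKLISTS[diet].filter((ingredient) => text.includes(ingredient));
}

// Example: a label scanned from a snack package.
const violations = checkLabel("Sugar, Gelatin, Citric Acid", "vegetarian");
console.log(
  violations.length === 0
    ? "Safe to eat!"
    : `Contains: ${violations.join(", ")}`
);
```

An empty result means the food meets the chosen diet; otherwise the offending ingredients can be shown to the user.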

How we built it

We built the app off of a Google Cloud API hackathon example in the Expo repo. From there, I extensively rewrote the program to analyze text rather than try to identify what an item was. My partner compiled a list of hundreds of ingredients that were either not vegan or not vegetarian. After that, I reworked the user interface and designed the logo for Veggy.
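The OCR step might look roughly like the sketch below, which posts a base64 photo to Google Cloud Vision's REST `images:annotate` endpoint with the `TEXT_DETECTION` feature and pulls the recognized text out of the response. This is a hedged sketch, not Veggy's actual code: `API_KEY`, `detectLabelText`, and `extractLabelText` are placeholder names I introduced here.

```javascript
// Placeholder -- supply your own Google Cloud API key.
const API_KEY = "YOUR_GOOGLE_CLOUD_API_KEY";

// Send a base64-encoded photo to Cloud Vision and return the detected text.
async function detectLabelText(base64Image) {
  const body = {
    requests: [
      {
        image: { content: base64Image },
        features: [{ type: "TEXT_DETECTION" }],
      },
    ],
  };
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    }
  );
  return extractLabelText(await res.json());
}

// The first textAnnotation holds the full block of detected text;
// fall back to an empty string when no text was found.
function extractLabelText(response) {
  const annotations = response.responses?.[0]?.textAnnotations;
  return annotations && annotations.length ? annotations[0].description : "";
}
```

The string returned here is what the ingredient analysis then scans against the compiled list.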

Challenges we ran into

We originally decided to build the app using Android Studio, since Android apps integrate easily with Microsoft's Natural Language APIs. However, we found those APIs disappointing: they could not pull text out of simple, close-up images of a label. Google Cloud's APIs, on the other hand, were much better at converting images to text. When my partner and I realized that writing the app in Expo would be easier and would make it cross-platform as a bonus, we switched to Expo.

Accomplishments that we're proud of

We are very proud of making an application that solves a real-world problem, one that millions of vegetarians and vegans face every day. In addition, we are glad that we got to use technologies (Expo, React Native, JavaScript) that neither of us had much prior experience with.

What we learned

We learned how to use Expo to easily design cross-platform apps. In addition, we gained experience with React Native and JavaScript. Finally, we found that for future analysis projects, Google's Cloud APIs would be the way to go.

What's next for Veggy

I would love to continue improving Veggy after Cal Hacks. The user interface can still be improved, as can the structure of the code. In addition, I am excited about expanding on the idea in the future through both AR and machine learning.
