We picked up a juice bottle, saw the label, and wondered what all the big words on it meant. But we were too lazy to Google them one by one, so we made this app to do it for us.

What it does

We created an app that lets the user simply take a picture of the ingredient list on the back of their food. The app reads the ingredients, scrapes the internet for information about them, and displays the results in a user-friendly manner.
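The first step after OCR is turning one messy line of label text into individual ingredient names. A minimal sketch of that step in Python, where the function name and the cleanup rules are our illustrative assumptions about typical label formatting, not the app's actual parsing logic:

```python
import re

def split_ingredients(ocr_text):
    """Split an OCR'd ingredients line into individual ingredient names.

    Assumes the label separates items with commas or semicolons; this is
    a hypothetical helper, not the app's real parser.
    """
    # Drop a leading "Ingredients:" label if the OCR captured it.
    text = re.sub(r"(?i)^\s*ingredients\s*:?", "", ocr_text)
    # Labels usually separate items with commas or semicolons.
    parts = re.split(r"[,;]", text)
    # Normalize case and strip stray whitespace/periods left by OCR.
    return [p.strip(" .\n").lower() for p in parts if p.strip(" .\n")]
```

Each name in the returned list can then be used as a lookup key for scraping or database queries.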

How we built it

For the backend, we created a Flask REST API, hosted on Microsoft Azure and connected to an Azure SQL database. For the frontend, we used React Native, with Google's Cloud Vision API to recognize and pull text out of images.
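A minimal sketch of the kind of Flask endpoint the backend might expose. The route name, in-memory data, and response shape are illustrative assumptions, not our project's actual API; in the real app the lookup would hit the Azure SQL database instead of a dictionary:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the Azure SQL lookup: a small in-memory table.
# (Hypothetical sample data, not the app's real database contents.)
INGREDIENT_FACTS = {
    "ascorbic acid": "Vitamin C; commonly added as an antioxidant.",
    "citric acid": "A natural acid used for tartness and preservation.",
}

@app.route("/ingredients/<name>")
def ingredient_info(name):
    """Return stored facts for one ingredient, or a 404 if unknown."""
    fact = INGREDIENT_FACTS.get(name.lower())
    if fact is None:
        return jsonify({"error": "ingredient not found"}), 404
    return jsonify({"name": name.lower(), "description": fact})
```

The React Native client would call this endpoint once per ingredient extracted from the photo and render the returned descriptions.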

Challenges we ran into

Too many. It took us a long time to set up our environments for mobile development, made harder by the fact that we all have iPhones but only one of us has a MacBook. We also ran into a lot of problems connecting to the Vision API and getting proper outputs from it.

Accomplishments that we are proud of

All of the ones above. We were able to create our first mobile app, one that promotes healthy habits and that people can actually use to improve their health.

What we learned

We learned how to use Azure to set up and host our Flask API and database, and how to use Google's Cloud Vision API and its text recognition abilities. But most importantly, we learned React Native and made a mobile app for the first time.

What's next for NutriApp

Features we were unable to implement include a settings screen and a more personalized feel for the app. We plan to display logs of a user's past scans and to analyze the data we collect.
