Inspiration

We wanted to make an app that helps people better manage what they eat. Diet-management apps do exist, but they're often difficult to use and require a lot of effort on the user's part, so we set out to create one that greatly expedites the process.

What it does

The app first prompts the user to create an account and sign in. The user is then taken to a page where they can either take a picture or view their previous food history. If they choose to take a picture, the camera starts; the user photographs their food, approves the shot, and the picture is sent back to the app. The app analyzes the photo to determine what food it is, then fetches nutritional data about it, including a determination of whether the food is healthy overall. The user can input whether or not they like that food item, and the data is stored in their food history. Later, the user can ask for food suggestions, and the app will suggest something they like based on their past responses.
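As a minimal sketch of the capture step, here is one way it could be wired up with Android's standard camera intent. The class and the analyzePhoto helper are illustrative names we're assuming for the example, not the app's actual code:

```java
import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.provider.MediaStore;

public class CaptureActivity extends Activity {
    private static final int REQUEST_IMAGE_CAPTURE = 1;

    // Launch the device camera via the standard capture intent.
    private void dispatchTakePictureIntent() {
        Intent takePicture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (takePicture.resolveActivity(getPackageManager()) != null) {
            startActivityForResult(takePicture, REQUEST_IMAGE_CAPTURE);
        }
    }

    // The camera app returns a thumbnail once the user approves the shot.
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
            Bitmap photo = (Bitmap) data.getExtras().get("data");
            analyzePhoto(photo); // hypothetical helper: sends the image off for tagging
        }
    }

    private void analyzePhoto(Bitmap photo) {
        // Placeholder for the Clarifai/Wolfram pipeline described below.
    }
}
```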

How we built it

We built the app using Android Studio, the Clarifai API, Firebase, the Wolfram Alpha API, Microsoft Azure, and AWS Lambda. Android Studio was used to design and program all of the app's functions. The Clarifai API analyzes the objects in the picture and returns tags describing what's in it; the tags are then checked against a list of known foods. Once the food in the picture is identified, it is looked up through the Wolfram Alpha API, and the nutritional details are returned to the app and also passed to a machine-learning engine powered by Microsoft Azure. The engine compares the food's nutrition facts to those of previously known foods and, using this comparison, determines whether the user's food is healthy or unhealthy overall. The user can then rank the food, and their response is sent to Firebase, which stores the user's food history and preferences so suggestions can be generated later. Finally, AWS Lambda links the app to Amazon Alexa: the user can ask Alexa for food suggestions, and Alexa returns suggestions drawn from the user's food history.
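As one concrete piece of this pipeline, here is a hedged sketch of how the ranking step could write to Firebase. The database paths, field names, and FoodEntry class are our assumptions for illustration; the app's actual schema isn't documented here:

```java
import com.google.firebase.auth.FirebaseAuth;
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;

public class FoodHistoryStore {
    // Save one rated food under the signed-in user's history.
    public static void saveRating(String foodName, boolean liked, boolean healthy) {
        String uid = FirebaseAuth.getInstance().getCurrentUser().getUid();
        DatabaseReference historyRef = FirebaseDatabase.getInstance()
                .getReference("users")
                .child(uid)
                .child("foodHistory"); // assumed path, for illustration only

        FoodEntry entry = new FoodEntry(foodName, liked, healthy);
        historyRef.push().setValue(entry); // push() creates a unique key per entry
    }

    // Plain data holder; Firebase serializes public fields automatically.
    public static class FoodEntry {
        public String name;
        public boolean liked;
        public boolean healthy;

        public FoodEntry() {} // no-arg constructor required by Firebase deserialization

        public FoodEntry(String name, boolean liked, boolean healthy) {
            this.name = name;
            this.liked = liked;
            this.healthy = healthy;
        }
    }
}
```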

Challenges we ran into

First, we ran into difficulty integrating the Clarifai API with the Android app: the tags weren't being returned, and tag output is crucial to every aspect of the app. After some restructuring of the code, the app started outputting the tags and we could focus on the other functions. Another challenge was reading the file that contained all of the foods. Eventually we scrapped the file and put the food names directly into a string array, as sketched below.
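A rough sketch of that array-based approach, assuming a simple first-match lookup against the Clarifai tags (the food names here are just examples, and matchTag is an illustrative name):

```java
import java.util.Arrays;
import java.util.List;

public class FoodList {
    // The food names were moved from a file into a plain string array.
    // These entries are examples; the real list is longer.
    private static final List<String> FOODS = Arrays.asList(
            "apple", "banana", "pizza", "salad", "sandwich", "pasta");

    // Return the first Clarifai tag that names a known food, or null if none match.
    public static String matchTag(List<String> tags) {
        for (String tag : tags) {
            if (FOODS.contains(tag.toLowerCase())) {
                return tag.toLowerCase();
            }
        }
        return null;
    }
}
```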

Accomplishments that we're proud of

We're proud of the fact that we made an entire, functional Android app! We're also proud of how the different pieces work together to give the user good results: the Clarifai API identifies each food item, Wolfram Alpha retrieves that food's nutrition info, Azure determines whether the food is healthy, and Firebase saves the user's food preferences. All of this together creates an easy and helpful experience for the user!

What we learned

While developing the app, we learned a lot! We learned more about Android app development, how to use the Clarifai API to identify objects in pictures, and how to build a machine-learning engine in Azure.

What's next for NutriCam

In the future, we hope to expand NutriCam's ability to identify food by adding more foods to the list. Additionally, we would like to give the user the ability to view their food history; the button is there, the functionality just needs to be added (a rough sketch of what that could look like follows). While NutriCam already has a lot of functionality, there's room for more!
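A hedged sketch of what the history screen's data loading might look like, reading the entries back from Firebase. It reuses the hypothetical FoodEntry class and database layout assumed in the earlier sketch:

```java
import com.google.firebase.auth.FirebaseAuth;
import com.google.firebase.database.DataSnapshot;
import com.google.firebase.database.DatabaseError;
import com.google.firebase.database.FirebaseDatabase;
import com.google.firebase.database.ValueEventListener;

import java.util.ArrayList;
import java.util.List;

public class FoodHistoryLoader {
    public interface Callback {
        void onHistoryLoaded(List<FoodHistoryStore.FoodEntry> entries);
    }

    // Read every saved entry once and hand the list back to the UI.
    public static void loadHistory(final Callback callback) {
        String uid = FirebaseAuth.getInstance().getCurrentUser().getUid();
        FirebaseDatabase.getInstance()
                .getReference("users").child(uid).child("foodHistory")
                .addListenerForSingleValueEvent(new ValueEventListener() {
                    @Override
                    public void onDataChange(DataSnapshot snapshot) {
                        List<FoodHistoryStore.FoodEntry> entries = new ArrayList<>();
                        for (DataSnapshot child : snapshot.getChildren()) {
                            entries.add(child.getValue(FoodHistoryStore.FoodEntry.class));
                        }
                        callback.onHistoryLoaded(entries);
                    }

                    @Override
                    public void onCancelled(DatabaseError error) {
                        // Hypothetical: surface the error to the UI.
                    }
                });
    }
}
```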
