Eating healthy is one of the most difficult aspects of weight loss and muscle gain. There are many apps that help users see what is in the food they are eating, but they are inconvenient and therefore not widely used. We were amazed by Clarifai's image recognition software and its ability to recognize various food items, and we are determined to develop an app that helps users eat healthier in a quick and convenient way.
What it does
The app uses Clarifai's image recognition API and its food model to detect food items. The detected item is then looked up in a food database to find its calorie, sugar, and fat content. The app then displays the user's total calorie, sugar, and fat intake per day in a graph, allowing the user to track their nutrition over a long period of time.
How we built it
After exploring Clarifai's groundbreaking API, we knew we wanted to apply it to the health field. First, the team found a database of over 2,000 food items and their corresponding calories, added sugars, and saturated fats. By reading this CSV file and loading the foods and their nutrition facts into a Python dictionary, we had a comprehensive dataset to draw values from.

Once a user takes a picture of their food, we compare the outputs of Clarifai's prediction API against our dataset. The user simply selects their food from a list of candidates ranked from strongest to weakest match to the picture. Once a food is selected, we look up its nutrition information in the dictionary and display the values.

Finally, these values are stored in a separate text file, and the user has the option to view a graph of their consumption over the last few days. Using matplotlib, we generate a scatter plot of the calories eaten over that period, which can help the user determine whether they are over or under their recommended calorie intake. After the core of the program was complete, we improved the user interface with the tkinter package, which displays pop-ups asking the user for input instead of forcing them to type into the Python console to confirm foods.
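A rough sketch of that pipeline in Python (the file names, CSV column layout, and the prediction list are illustrative assumptions; in the real app the predicted labels come from Clarifai's API):

```python
import csv
from datetime import date


def load_nutrition_data(path):
    """Read the food CSV into a dict: food name -> (calories, sugar_g, sat_fat_g).
    The column names here are assumptions about the dataset's layout."""
    foods = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            foods[row["food"].lower()] = (
                float(row["calories"]),
                float(row["sugar_g"]),
                float(row["sat_fat_g"]),
            )
    return foods


def match_predictions(predictions, foods):
    """Keep only the predicted labels that exist in our dataset,
    preserving Clarifai's strongest-to-weakest ranking."""
    return [label for label in predictions if label.lower() in foods]


def log_intake(log_path, calories, day=None):
    """Append one 'YYYY-MM-DD,calories' line to the plain-text intake log."""
    day = day or date.today().isoformat()
    with open(log_path, "a") as f:
        f.write(f"{day},{calories}\n")


def daily_totals(log_path):
    """Sum logged calories per day, ready for plotting."""
    totals = {}
    with open(log_path) as f:
        for line in f:
            day, cals = line.strip().split(",")
            totals[day] = totals.get(day, 0.0) + float(cals)
    return totals


def plot_intake(log_path):
    """Scatter plot of calories per day; matplotlib is imported lazily
    so the rest of the module works without it installed."""
    import matplotlib.pyplot as plt

    totals = daily_totals(log_path)
    days = sorted(totals)
    plt.scatter(days, [totals[d] for d in days])
    plt.xlabel("Date")
    plt.ylabel("Calories")
    plt.title("Calorie intake over time")
    plt.show()
```

The user-facing flow then reduces to: take a picture, get ranked labels from the model, filter them with `match_predictions`, let the user pick one, and look up its nutrition tuple before logging it.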
Challenges we ran into
Perhaps the biggest limitation of our program is the dataset containing the different foods and their nutrition facts. While this dataset is fairly comprehensive, it doesn't cover every type of food, and some entries are overly specific. For example, there is no choice for "slice of cheese"; there is only a "cheese" option whose portion is larger than a single slice. Another challenge was comparing the outputs of Clarifai's prediction API to the foods in our data: Clarifai's food model recognizes far more items than our dataset contains, so we weren't able to find calorie amounts for every food returned by the API.
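One possible mitigation for that gap (not something we implemented at the hackathon, just a sketch using Python's standard-library difflib) is fuzzy matching, so a predicted label that is close to, but not identical to, a dataset entry can still be resolved:

```python
from difflib import get_close_matches


def best_dataset_match(label, food_names, cutoff=0.6):
    """Return the dataset entry most similar to a predicted label, or None.

    `food_names` is assumed to be the lowercase keys of the nutrition dict;
    `cutoff` is difflib's similarity threshold (0.0 to 1.0).
    """
    matches = get_close_matches(label.lower(), food_names, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

With a helper like this, a label such as "cheeseburger" can fall back to a plain "burger" entry instead of being discarded outright.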
Accomplishments that we're proud of
The team was able to produce a working demo in time for HackDuke 2018: Code for Good.
What we learned
The team learned to analyze data with Python and how to improve the UI of a Python project using tkinter. In addition, the team is excited to turn this code into an app over the next few weeks by rewriting the program in Swift to make it compatible with iOS. As a result, we learned Swift and were even able to lay the foundations of an app that can access an iPhone camera and take a picture of food directly. Finally, the team learned the importance of delegating responsibilities and working as a team: the project required several different modules with numerous functions, and each person had to write code that was transferable and returned a specific value used by the next function.
What's next for NutriFact
The team has begun production of an iOS app that uses the Clarifai Apple SDK to offer an even quicker and more convenient way of tracking nutrition. The team currently has an outline of the app that lets the user take and store a photo, and will soon be able to use the SDK to upload the image and display the nutrition facts to the user.