Inspiration
We were looking for a project that would let us explore and use interesting new technologies.
What it does
NutriTrack identifies a food item in a picture and retrieves its nutritional information, which the web portion of the app then displays for the user as graphs and charts.
How we built it
NutriTrack uses the Clarifai API to categorize food items in images captured by the Android companion app, and the Nutritionix API to provide the user with the food's nutritional values. Firebase Storage and the Realtime Database are used to sync data between the web and mobile parts of the app.
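The recognize-then-look-up pipeline can be sketched as two REST calls. This is a minimal illustration, not the app's actual code: the API keys are placeholders, the Clarifai model name is illustrative, and the request shapes follow the public Clarifai v2 and Nutritionix v2 endpoints.

```python
import base64
import json
import urllib.request

# Placeholder credentials -- each service issues its own keys.
CLARIFAI_API_KEY = "YOUR_CLARIFAI_KEY"
NUTRITIONIX_APP_ID = "YOUR_APP_ID"
NUTRITIONIX_APP_KEY = "YOUR_APP_KEY"


def build_clarifai_request(image_bytes: bytes) -> urllib.request.Request:
    """Ask a Clarifai food model to label the food in a captured image."""
    payload = {
        "inputs": [
            {"data": {"image": {"base64": base64.b64encode(image_bytes).decode("ascii")}}}
        ]
    }
    return urllib.request.Request(
        # Model name is illustrative; Clarifai exposes several food models.
        "https://api.clarifai.com/v2/models/food-item-recognition/outputs",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Key {CLARIFAI_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def build_nutritionix_request(food_name: str) -> urllib.request.Request:
    """Look up nutrient values for the recognized food by name."""
    return urllib.request.Request(
        "https://trackapi.nutritionix.com/v2/natural/nutrients",
        data=json.dumps({"query": food_name}).encode("utf-8"),
        headers={
            "x-app-id": NUTRITIONIX_APP_ID,
            "x-app-key": NUTRITIONIX_APP_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the first request and feeding its top label into the second yields the nutritional values the web app charts.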
Challenges we ran into
Both of us were new to Firebase, so it was very difficult to get it working the way we wanted. We spent hours trying to upload images from the Android camera to Firebase Storage, and many more trying to sync key information between the web and mobile parts of our app using the Realtime Database. We had also never used Android Studio before, which made it quite difficult to get the Android portion up and running.
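On Android the upload goes through the Firebase SDK (e.g. `StorageReference.putBytes()`); the equivalent simple upload over Storage's REST endpoint can be sketched as below. The bucket name is a placeholder, and this is an assumption-laden illustration rather than the app's code.

```python
import urllib.parse
import urllib.request

# Placeholder bucket -- Firebase assigns each project one like "<project>.appspot.com".
BUCKET = "your-project.appspot.com"


def build_upload(object_name: str, image_bytes: bytes) -> urllib.request.Request:
    """Simple (non-resumable) upload of one JPEG to Firebase Storage's v0 REST endpoint."""
    url = (
        "https://firebasestorage.googleapis.com/v0/b/"
        f"{BUCKET}/o?name={urllib.parse.quote(object_name, safe='')}"
    )
    return urllib.request.Request(
        url,
        data=image_bytes,                       # raw image bytes as the request body
        headers={"Content-Type": "image/jpeg"},
        method="POST",
    )
```

A real upload would also attach an auth token; the sketch only shows the shape of the call.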
Accomplishments that we're proud of
We are very proud that we were able to create a nearly seamless cross-platform experience using technologies that were new to us.
What we learned
We learned how to transfer data between a mobile app and a web app using Firebase, as well as how to use the camera in Android Studio to upload images to Firebase Storage.
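The web/mobile sync works because both clients read and write the same Realtime Database paths. A minimal sketch of that idea via the database's REST API, with a placeholder project URL and hypothetical path names:

```python
import json
import urllib.request

# Placeholder -- Firebase assigns each project its own database host.
DB_URL = "https://your-project.firebaseio.com"


def build_write(path: str, value: dict) -> urllib.request.Request:
    """PUT a JSON value at a database path; SDK listeners on web and mobile see the change."""
    return urllib.request.Request(
        f"{DB_URL}/{path}.json",
        data=json.dumps(value).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )


def read_url(path: str) -> str:
    """URL that returns the current JSON value at a path via HTTP GET."""
    return f"{DB_URL}/{path}.json"
```

In practice both apps used the Firebase SDKs, which keep such paths synced automatically instead of polling REST URLs.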
What's next for NutriTrack
In the future we hope the mobile portion of NutriTrack will gain most, if not all, of the features of the web portion, along with new additions such as a log-in system and more analytics. We also hope to provide many more customization options to help users make the app their own.