Inspiration

We brainstormed and came up with six different ideas, ranging from fall detection for the elderly to GPS-sticker tracking. Our final idea wasn't even one of them. By chance, someone asked about scanning food, and we all thought it would be cool to take a picture of a meal and get its nutrition facts. We decided to take a risk and start the project that became LiveFoods.

How we built it

We used Firebase to process images and detect the objects they contain, and Azure Table Storage to hold a database of foods and their nutritional values. Once the Firebase API identified a food object, we retrieved that object's information from the Azure Table and displayed it on screen. We built the project in Java using Android Studio. The user has two options: take a picture and get data on their food, or analyze food live as the camera moves.
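The core flow above is a label-to-nutrition lookup. A minimal sketch in plain Java, with an in-memory map standing in for the Azure Table (the `FoodFacts` class, food names, and values are illustrative, not the actual database schema):

```java
import java.util.HashMap;
import java.util.Map;

public class NutritionLookup {
    // Simple value object mirroring the properties a table entity might carry.
    public static class FoodFacts {
        public final int calories;
        public final double proteinGrams;
        public FoodFacts(int calories, double proteinGrams) {
            this.calories = calories;
            this.proteinGrams = proteinGrams;
        }
    }

    // In-memory stand-in for the Azure Table; values here are examples only.
    private static final Map<String, FoodFacts> TABLE = new HashMap<>();
    static {
        TABLE.put("apple", new FoodFacts(95, 0.5));
        TABLE.put("banana", new FoodFacts(105, 1.3));
    }

    // The image labeler returns a label string; normalize it and look up
    // the matching row, or return null if the food is unknown.
    public static FoodFacts lookup(String label) {
        return TABLE.get(label.toLowerCase().trim());
    }

    public static void main(String[] args) {
        FoodFacts facts = lookup("Apple");
        System.out.println("calories=" + facts.calories);
    }
}
```

In the real app the map lookup would be replaced by a query against the Azure Table, and the label would come from Firebase's image recognition result.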

Challenges we ran into

First, we weren't able to retrieve Azure Table entities from our Firebase-based code. We spent hours trying to fix the problem, and someone from the Microsoft booth even came over to the Firebase booth to help us. We also ran into trouble with threading in Android Studio: Android does not allow network calls, such as the Azure Table query, on the main thread. The biggest issue, though, was improving the speed of the cloud image processing. Eventually we realized that by running the cloud call asynchronously on a background thread instead of blocking the main thread, we could noticeably speed up the application.
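The threading fix can be sketched in plain Java with an `ExecutorService`: the slow remote call runs on a worker thread while the main (UI) thread stays free. In the actual app the result would be posted back to the UI (e.g. via `runOnUiThread`); here the worker simply returns a `Future`, and the slow call is simulated:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncLookup {
    // Single background worker for remote calls, keeping them off the main thread.
    private static final ExecutorService worker = Executors.newSingleThreadExecutor();

    // Simulated slow remote call (stands in for the Azure Table / cloud query).
    static String fetchNutrition(String food) {
        try {
            Thread.sleep(50); // pretend network latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return food + ": 95 kcal"; // illustrative value, not real data
    }

    // Submit the fetch to the worker; the caller's thread is never blocked.
    public static Future<String> lookupAsync(String food) {
        return worker.submit(() -> fetchNutrition(food));
    }

    public static void main(String[] args) throws Exception {
        Future<String> result = lookupAsync("apple");
        // The main thread could keep handling UI events here.
        System.out.println(result.get());
        worker.shutdown();
    }
}
```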

Accomplishments that we're proud of

We were able to successfully complete and run an Android application that exceeded our expectations. Given that we had zero experience in Android app development and cloud-based data storage, we did not initially expect to create a product that actually worked.

What we learned

We learned how to take abstract concepts we had never touched before and turn them into a product.

What's next for LiveFoods

  • Direct import into diet trackers
  • Ability to share foods with other users
  • On-device machine learning to improve image-recognition accuracy
  • Additional nutrition facts