Inspiration

The project grew out of our curiosity about different types of meals. We realized we could build an application that detects different meals and provides information about them in real time.

What it does

DetectMeal is an Android application that detects different types of meals and provides information about them in real time.

How we built it

We used a machine learning food-detection model from TensorFlow Hub and integrated it into our Android application with ML Kit. We then got the camera to draw a bounding box around every food item it comes across and to display the food name and a confidence score for each item. We used the CameraX library to handle the camera, and Lottie for the animations in the application.
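The wiring between these pieces can be sketched as follows. The ML Kit and CameraX classes are real, but the model filename, threshold values, and the overlay hand-off are placeholders standing in for our actual code:

```kotlin
import androidx.camera.core.ImageAnalysis
import com.google.mlkit.common.model.LocalModel
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.objects.ObjectDetection
import com.google.mlkit.vision.objects.custom.CustomObjectDetectorOptions

// Load the TensorFlow Hub food model bundled in assets/ (filename is a placeholder).
val localModel = LocalModel.Builder()
    .setAssetFilePath("food_detector.tflite")
    .build()

// STREAM_MODE keeps latency low for a live camera feed; classification
// gives us the food label and its confidence score.
val options = CustomObjectDetectorOptions.Builder(localModel)
    .setDetectorMode(CustomObjectDetectorOptions.STREAM_MODE)
    .enableClassification()
    .setClassificationConfidenceThreshold(0.5f)
    .setMaxPerObjectLabelCount(1)
    .build()

val detector = ObjectDetection.getClient(options)

// CameraX ImageAnalysis feeds preview frames to the detector; keeping
// only the latest frame stops a backlog from building up.
val analysis = ImageAnalysis.Builder()
    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
    .build()

@androidx.camera.core.ExperimentalGetImage
fun bindAnalyzer(executor: java.util.concurrent.Executor) {
    analysis.setAnalyzer(executor) { imageProxy ->
        val mediaImage = imageProxy.image
        if (mediaImage == null) {
            imageProxy.close()
            return@setAnalyzer
        }
        val input = InputImage.fromMediaImage(mediaImage, imageProxy.imageInfo.rotationDegrees)
        detector.process(input)
            .addOnSuccessListener { objects ->
                for (obj in objects) {
                    val label = obj.labels.firstOrNull()
                    // obj.boundingBox, label?.text, and label?.confidence
                    // are what we pass to the overlay view for drawing.
                }
            }
            .addOnCompleteListener { imageProxy.close() } // release the frame
    }
}
```

The `ImageAnalysis` use case is then bound to the camera lifecycle alongside the preview use case in the usual CameraX way.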

Challenges we ran into

Figuring out how to draw the bounding box when the camera detects a food item
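Our eventual approach was a custom overlay View drawn on top of the camera preview. A minimal sketch, with names of our own choosing (and note the detector reports boxes in image coordinates, which still have to be scaled into view coordinates before drawing):

```kotlin
import android.content.Context
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.RectF
import android.util.AttributeSet
import android.view.View

// A transparent View layered above the CameraX PreviewView in the layout.
// setResults() is called from the detector's success listener.
class BoxOverlay(context: Context, attrs: AttributeSet? = null) : View(context, attrs) {

    private val boxPaint = Paint().apply {
        style = Paint.Style.STROKE
        strokeWidth = 6f
        color = Color.GREEN
    }
    private val textPaint = Paint().apply {
        textSize = 42f
        color = Color.GREEN
    }

    // Each result: a bounding box (already scaled to view coordinates)
    // plus a caption such as "pizza 0.87".
    private var results: List<Pair<RectF, String>> = emptyList()

    fun setResults(newResults: List<Pair<RectF, String>>) {
        results = newResults
        invalidate() // schedules onDraw() on the UI thread
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        for ((box, caption) in results) {
            canvas.drawRect(box, boxPaint)
            canvas.drawText(caption, box.left, box.top - 10f, textPaint)
        }
    }
}
```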

Accomplishments that we are proud of

  • The application works: it detects and provides information about different meals (limited to North American meals for now)
  • We deployed the application, and people can download and test it on their Android devices

What we learned

  • How to use TensorFlow models in Android applications with ML Kit
  • How to use Canvas for drawing in Android applications
  • How the CameraX library makes building camera applications easier
  • How to use Lottie animations in Android applications

What's next for DetectMeal

  • Publish and deploy it on the Google Play Store
  • Make it cross-platform

Built With

  • TensorFlow Hub
  • ML Kit
  • CameraX
  • Lottie