Inspiration

  • For health fanatics, fitness enthusiasts, or foodies, it helps to know what you're eating
  • Most of us aren't eating healthily in lockdown (be honest, you aren't)
  • We wanted a new way of tempting ourselves to eat healthier
  • And to learn more about nutrition along the way

What it does

  • Uses your phone's camera to recognise foods with computer vision and displays 3D models of their nutritional value in augmented reality (see the placement sketch after this list)
  • Easy to use: just point your camera at a fruit and try it!
  • Family friendly: children can use it to learn, and it's suitable for all ages
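
To make the flow concrete, below is a minimal Unity C# sketch of the display step, assuming the classifier has already returned a fruit label. The NutritionSpawner component, its fields, and the prefab-naming convention are illustrative assumptions, not the project's actual code.

    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical sketch: once a fruit label comes back from the classifier,
    // spawn the matching 3D nutrition model a short distance in front of the camera.
    public class NutritionSpawner : MonoBehaviour
    {
        // One nutrition model per fruit, named after the classifier's labels
        // (e.g. "apple", "banana"); assigned in the Unity inspector.
        public List<GameObject> nutritionModels;
        public float spawnDistance = 0.5f;   // metres in front of the camera

        public void ShowNutrition(string fruitLabel)
        {
            GameObject model = nutritionModels.Find(m => m.name == fruitLabel);
            if (model == null) return;   // a fruit we haven't modelled yet

            Transform cam = Camera.main.transform;
            Instantiate(model,
                        cam.position + cam.forward * spawnDistance,
                        Quaternion.LookRotation(-cam.forward));   // face the user
        }
    }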

How we built it

  • Google Cloud AutoML Vision for recognising fruits (see the request sketch after this list)
  • C# and Unity to display 3D models hosted on echoAR
  • MATLAB to build 3D models from nutritional information
  • (Additional) Java and Android Studio for the Android version
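
For the recognition step, a trained AutoML Vision model can be called through its REST predict endpoint. The sketch below shows one way to send a camera frame from Unity; the project ID, model ID, and access token are placeholders, and the real app may structure this differently.

    using System;
    using System.Collections;
    using System.Text;
    using UnityEngine;
    using UnityEngine.Networking;

    // Sketch of posting a camera frame to an AutoML Vision model.
    // PROJECT_ID, MODEL_ID and the OAuth access token are placeholders.
    public class FruitClassifier : MonoBehaviour
    {
        const string PredictUrl =
            "https://automl.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/models/MODEL_ID:predict";

        public IEnumerator Classify(Texture2D frame, string accessToken, Action<string> onResponse)
        {
            // AutoML Vision expects the image as base64 inside a JSON payload.
            string imageBytes = Convert.ToBase64String(frame.EncodeToJPG());
            string body = "{\"payload\":{\"image\":{\"imageBytes\":\"" + imageBytes + "\"}}}";

            using (var request = new UnityWebRequest(PredictUrl, "POST"))
            {
                request.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(body));
                request.downloadHandler = new DownloadHandlerBuffer();
                request.SetRequestHeader("Content-Type", "application/json");
                request.SetRequestHeader("Authorization", "Bearer " + accessToken);

                yield return request.SendWebRequest();

                // The response JSON lists label/score pairs; a real app would parse it
                // and hand the top displayName to the AR display step.
                onResponse(request.downloadHandler.text);
            }
        }
    }

On device, this coroutine would be started with the latest camera frame and its top label passed on to the AR display step sketched above.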

Challenges we ran into

  • Little prior experience working with Unity or Android Studio
  • First time using echoAR

Accomplishments that we're proud of

  • Training an ML model to distinguish different types of fruit
  • Learning to incorporate multiple APIs and cutting-edge technologies
  • Displaying models in augmented reality space using Unity and C#
  • We had fun building an interactive, fully functioning app (some of us for the first time)

What we learned

  • Choosing the right platform
  • Quickly picking up new frameworks/libraries and putting them into use
  • Learning to use ML models and the Google Cloud AutoML Vision API

What's next for Fiberr

  • More types of food
  • Alternative designs
  • Faster detection

Built With

  • C# / Unity
  • echoAR
  • Google Cloud AutoML Vision
  • MATLAB
  • Java / Android Studio