Blocking out or "hiding" unhealthy options while grocery shopping through AR.

What it does

An Android application takes pictures/videos and sends them to a Qualcomm DragonBoard. The Snapdragon then calls IBM's Watson, which recognizes the different foods, and OpenCV blacks out the unhealthy options.
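The "decide what to hide" step can be sketched as a filter over Watson's classification results. This is a minimal sketch, not the project's code: it assumes the response shape of Watson Visual Recognition's classify call, and the `UNHEALTHY` label set is a hypothetical placeholder.

```python
# Hypothetical list of labels the app should black out.
UNHEALTHY = {"candy", "soda", "chips"}

def labels_to_hide(watson_response, threshold=0.5):
    """Return recognized class names that should be blacked out,
    assuming a Watson Visual Recognition-style classify response."""
    hide = []
    for image in watson_response.get("images", []):
        for classifier in image.get("classifiers", []):
            for cls in classifier.get("classes", []):
                # Keep only confident matches against the unhealthy set.
                if cls["score"] >= threshold and cls["class"] in UNHEALTHY:
                    hide.append(cls["class"])
    return hide
```

The returned labels would then be matched to regions in the frame for OpenCV to mask.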

How we built it

The Android app was written in Kotlin, a Python Flask server was used to call Watson, and Python was used to communicate with the DragonBoard.
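The Flask piece can be sketched as a single upload endpoint. This is an illustrative sketch only: the `/classify` route name is an assumption, and the Watson call is stubbed out (the real service would pass the image bytes to IBM Watson Visual Recognition with credentials).

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def classify_with_watson(image_bytes):
    # Placeholder: the real server forwards these bytes to IBM Watson
    # Visual Recognition. Stubbed with a canned result so the sketch
    # runs offline.
    return {"classes": [{"class": "banana", "score": 0.92}]}

@app.route("/classify", methods=["POST"])
def classify():
    # The Android app uploads a camera frame as multipart form data.
    image = request.files["image"]
    result = classify_with_watson(image.read())
    return jsonify(result)
```

The app (or the DragonBoard) POSTs a frame and gets the recognized classes back as JSON.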

Challenges we ran into

The biggest challenge (still) is taking the image captured by the Android app and sending it to the Snapdragon over a WebSocket.
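One common way to solve "send an image over a socket" is to length-prefix each frame so the receiver knows where it ends. The sketch below uses a plain TCP socket and the standard library only; the project used a WebSocket, but the framing idea is the same, and all names here are illustrative.

```python
import socket
import struct

def send_frame(sock, jpeg_bytes):
    # Prefix the payload with its 4-byte big-endian length.
    sock.sendall(struct.pack("!I", len(jpeg_bytes)) + jpeg_bytes)

def recv_frame(sock):
    # Read the length header, then exactly that many payload bytes.
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def _recv_exact(sock, n):
    # recv() may return partial data, so loop until n bytes arrive.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf
```

With proper WebSockets this framing comes for free, since the protocol delivers whole messages; the manual version above is the fallback for raw sockets.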

Accomplishments that we're proud of

Writing the Android application, and using OpenCV to recognize the different food items, contour them, and black them out.

What we learned

Using Python, OpenCV, and the IBM Watson API. We also learned that it's best not to add extra components to a streamlined pipeline, and that Android was not made to serve images.

What's next for NutriBuddy

Live video streaming to Watson for image recognition.
