As high schoolers (and sleep-deprived hackers), we have become increasingly aware of the impact that unhealthy foods have on our bodies. Many of us say we count the calories we consume and make reasonable plans to exercise daily. But how many people actually follow through on what they say?

Instead, a tool that reveals what is best for our bodies before we purchase food would be valuable in making people healthier. Given the recent burst of AR technology, we decided that people would be more willing to view nutritional information in an interactive, easy way: simply holding up their phones to any food in the market.

What it does

First, the user taps an area of the camera view on their Android phone. The app then scans the tapped area, determines the category of the item, and checks whether the product belongs to a specific brand. Finally, it uses online resources to look up the food's nutritional information and displays it over the camera view in the UI.
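The tap-to-nutrition flow above can be sketched as a small async pipeline. This is a minimal illustration, not our actual app code: the helper names and the stubbed responses are hypothetical stand-ins for the real Google Vision and nutrition-database calls.

```javascript
// Stub for the image-recognition step: in the real app this sends the
// tapped crop to the Google Vision API and returns candidate labels.
async function identifyItem(tappedRegion) {
  return ['cereal box', 'Cheerios'];
}

// Stub for the nutrition lookup: in the real app this queries an online
// nutrition database for the best-matching label.
async function fetchNutrition(label) {
  return { name: label, calories: 140, sugars: 9 };
}

// The overall flow: tap -> identify -> look up -> display.
async function handleTap(tappedRegion) {
  const candidates = await identifyItem(tappedRegion);
  const nutrition = await fetchNutrition(candidates[0]);
  return nutrition; // rendered as an AR overlay on the camera view
}
```

In the real app the first candidate is not always taken blindly; the list of possibilities is filtered before the lookup, as described below.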

How we built it

First, we brainstormed exactly how the app would work. Afterwards, we started working on the UI of the Android app. We then used the Google Vision API to help identify the grocery item the user selects, combining confidence scores, product-logo detections, and web entity searches to compile an accurate list of possibilities for the selected item. Next, we ran this list through a nutrition database, NutritionIx. To process the large responses, we wrote a node.js method that sorts through the nutritional values, which were returned in JSON format. Finally, we created an AR view to display the information in an interactive manner. (Then we slept... for one hour.)
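The node.js JSON-sorting step might look something like the sketch below. The sample data and field names are illustrative (modeled loosely on NutritionIx-style responses), not the exact API schema or our exact code.

```javascript
// Illustrative response shape: a list of candidate foods with
// per-item nutritional values.
const sampleResponse = {
  foods: [
    { food_name: 'granola bar',  nf_calories: 190, nf_sugars: 12,  nf_protein: 3 },
    { food_name: 'apple',        nf_calories: 95,  nf_sugars: 19,  nf_protein: 0.5 },
    { food_name: 'potato chips', nf_calories: 160, nf_sugars: 0.2, nf_protein: 2 },
  ],
};

// Pull out the values we care about for the AR overlay and sort the
// candidate matches by calorie count, lowest first.
function summarizeNutrition(response) {
  return response.foods
    .map(({ food_name, nf_calories, nf_sugars, nf_protein }) => ({
      name: food_name,
      calories: nf_calories,
      sugars: nf_sugars,
      protein: nf_protein,
    }))
    .sort((a, b) => a.calories - b.calories);
}

console.log(summarizeNutrition(sampleResponse)[0].name); // → 'apple'
```

Flattening the raw response into a small, uniformly named object like this keeps the AR display code independent of the database's JSON layout.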

Challenges we ran into

Three out of four of our group members were amateur programmers (one only knew Java and C++, another knew Python, and the third knew Arduino). We argued incessantly over our idea well into Saturday and started programming quite late. We also initially decided to build the app using ARKit on iOS 11, which meant learning and using Swift. When we later switched platforms to Android, we had to develop new JSON search methods (though it was admittedly easier the second time, since we had just done it in Swift).

Accomplishments that we're proud of

The three amateur programmers learned new languages and tools, such as node.js, Swift, and Kotlin. The one advanced programmer learned not to team up with us.

Tee-hee ;)

What we learned

We learned that at hackathons, and at any time-constrained event, team members should NOT argue over ideas for an excessive amount of time. Debating ideas helps make them better, but spending too long arguing means working under more stress (and less sleep).

We also found that learning a new language through hands-on experience is much easier than taking an online class. With node.js in particular, the learning curve was steep at first, but the language felt much more streamlined the longer we used it.

What's next for GrocARy

We are working on using machine learning to scan the history of products a user has purchased and recommend better alternatives they might buy in the future.
