When James wrestled competitively in high school, he was meticulous about his diet, since the balance between weight and lean muscle mass is crucial in the sport. He studied the nutrition labels of everything he ate during the wrestling season. However, logging those numbers was inconvenient, so he resorted to estimating his caloric intake instead.

Tracking one's diet is expected of competitive athletes in any sport, and useful for anyone trying to stay on a diet, and streamlined logging could make it much easier. Since half the team owned an Alexa device, we also wanted to take advantage of hands-free technology.

Thus, Food Cycle was born.

What does Food Cycle do?

Using our application, the user can take a picture of a meal with an iPhone and automatically log its nutritional information. Stored per day, this information includes calories, carbs, cholesterol, sodium, sugar, protein, and fat, along with the meals themselves. Food Cycle also implements an Alexa skill that relays the nutrition log recorded by the application, reading it aloud to the user.
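As a rough illustration, one day's log might look something like the sketch below. The field names and shape are assumptions for illustration, not the app's actual Firebase schema.

```javascript
// Hypothetical sketch of one day's entry as stored in Firebase.
// Field names and values are illustrative, not the app's actual schema.
const dayLog = {
  date: "2019-02-10",
  meals: [
    {
      name: "grilled chicken salad",
      calories: 420, carbs: 18, cholesterol: 85,
      sodium: 540, sugar: 6, protein: 38, fat: 22,
    },
  ],
};

// A daily total for any tracked field is the sum over that day's meals.
function totalFor(log, field) {
  return log.meals.reduce((sum, meal) => sum + meal[field], 0);
}
```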

How was Food Cycle built?

We built the backend of the application in Xcode and integrated it with Firebase to transfer nutritional data. We used Core ML as the framework for the machine-learning model that recognizes food in photos, pairing it with the food101 API to attach nutritional data. Additionally, we built the voice frontend in the Alexa Developer Console and connected it to Firebase, with help from Shivany Shenoy of Panera. We wrote the dialogue commands (intents) and modified the skill's JavaScript code through its Alexa-hosted AWS backend.
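The Alexa side boils down to an intent handler that fetches the day's totals and speaks them back. Here is a hedged sketch in the ASK SDK v2 handler style used by Alexa-hosted skills; the intent name, helper names, and phrasing are assumptions, not our skill's actual code, and the Firebase read is stubbed out.

```javascript
// Stub for the Firebase read; the real skill pulls these totals from Firebase.
async function fetchTodayTotals() {
  return { calories: 1800, protein: 90, fat: 60 };
}

// Turn a day's totals into Alexa's spoken response.
function speakTotals(totals) {
  return `Today you had ${totals.calories} calories, ` +
    `${totals.protein} grams of protein, and ${totals.fat} grams of fat.`;
}

// ASK SDK v2-style handler: canHandle matches the intent, handle builds speech.
const GetNutritionLogHandler = {
  canHandle(handlerInput) {
    const request = handlerInput.requestEnvelope.request;
    return request.type === "IntentRequest" &&
      request.intent.name === "GetNutritionLogIntent"; // assumed intent name
  },
  async handle(handlerInput) {
    const totals = await fetchTodayTotals();
    return handlerInput.responseBuilder
      .speak(speakTotals(totals))
      .getResponse();
  },
};
```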

What are some problems we ran into?

Connecting the frontend in the Alexa Developer Console to the Firebase backend, and retrieving the correct data from it, proved difficult due to our limited experience with Alexa skills and JavaScript. Shivany Shenoy helped us through this process.

What are some accomplishments we're proud of?

Despite the challenges, we surprised ourselves with how quickly we made progress on a working photo-recognition app and the complementary Alexa skill. Our team split into two groups that worked independently on these two parts; this vastly improved our efficiency and suited our strengths and interests. We also believe this is a diverse project, combining a phone app, photo recognition, and an Alexa device, all for the sake of making diet tracking less inconvenient.

What did we learn?

None of us had used the Alexa Developer Console before, so that was an enjoyable learning process. In JavaScript, we learned about async functions and promises. We also learned about iOS development, integrating APIs into iOS apps, using Cloud Functions for Firebase, and creating a Go server.
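The promise/async pattern we learned can be shown with a toy example: wrapping an asynchronous lookup in a Promise and consuming it with await. The lookup table and function names here are purely illustrative, not our real database code.

```javascript
// Wrap a delayed lookup in a Promise (toy data, not our real database).
function getCalories(food) {
  const table = { apple: 95, bagel: 245 };
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      if (food in table) resolve(table[food]);
      else reject(new Error(`unknown food: ${food}`));
    }, 10);
  });
}

// An async function can await each promise, keeping the code linear.
async function logMeal(foods) {
  let total = 0;
  for (const food of foods) {
    total += await getCalories(food);
  }
  return total;
}
```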

What's next for Food Cycle?

We can implement nutritional goals, such as a calorie limit or protein minimum, and track what percentage of each daily goal the user has reached. We can also program Alexa to suggest optimal meal options catered to the user's needs on request. In terms of photo recognition, we can train our own model to identify certain foods, such as drink containers, more accurately than food101.
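The goal-tracking idea is simple arithmetic; a minimal sketch follows, with the goal values chosen purely as examples rather than nutritional recommendations.

```javascript
// Hypothetical daily goals; the numbers are examples, not recommendations.
const goals = { calories: 2200, protein: 120 };

// Percentage of a daily goal reached so far, rounded to a whole number.
function percentOfGoal(consumed, goal) {
  return Math.round((consumed / goal) * 100);
}
```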
