Inspired by calorie and nutritional tracking applications, this Google Assistant app aims to make tracking even easier by adding voice input, with the potential to add barcode scanning on devices with a camera. Making it easier to track food and nutrient intake should encourage more people to set and meet their fitness goals.
What it does
The application currently supports a limited set of actions: it can take input about a food and read back its calories and macronutrient composition. Look to What's next to learn what we have in store!
How I built it
Development started with a Dialogflow project in the Google Actions console. After setting up the necessary entities and intents, three major actions were created. The first logs a food: the user says an amount, a unit, and a food, which is sent through an API call to fetch nutritional information, which is then parsed for calories and macronutrients. The second asks for the day's nutritional totals: it queries the database and sums the logged amounts, returning either a single macronutrient or all the information together, depending on what is requested. The third lets the user set new goals, updating the database and comparing each goal to what has been eaten so far that day.
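The three actions above can be sketched roughly as plain functions. This is a minimal illustration, not the app's actual fulfillment code: it uses an in-memory array in place of the real database and a hard-coded lookup table in place of the nutrition API, and all names (`NUTRITION`, `logFood`, `dailyTotals`, `setGoal`) are hypothetical.

```javascript
// Per-100g nutrition facts; a real version would fetch these
// from the nutrition API instead of a hard-coded table.
const NUTRITION = {
  banana: { calories: 89, protein: 1.1, carbs: 22.8, fat: 0.3 },
  rice:   { calories: 130, protein: 2.7, carbs: 28.2, fat: 0.3 },
};

const log = [];  // today's logged foods (stand-in for the database)
const goals = { calories: 2000, protein: 120, carbs: 250, fat: 70 };

// Action 1: log an amount (in grams) of a food, scaling the
// per-100g facts, and read back the calories.
function logFood(food, grams) {
  const facts = NUTRITION[food];
  if (!facts) return `Sorry, I don't know ${food}.`;
  const entry = {};
  for (const key of Object.keys(facts)) {
    entry[key] = (facts[key] * grams) / 100;
  }
  log.push(entry);
  return `Logged ${grams} g of ${food}: ${Math.round(entry.calories)} kcal.`;
}

// Action 2: sum everything eaten today across all log entries.
function dailyTotals() {
  const totals = { calories: 0, protein: 0, carbs: 0, fat: 0 };
  for (const entry of log) {
    for (const key of Object.keys(totals)) totals[key] += entry[key];
  }
  return totals;
}

// Action 3: set a new goal for one nutrient, then compare it
// to what has already been eaten today.
function setGoal(nutrient, amount) {
  goals[nutrient] = amount;
  const eaten = dailyTotals()[nutrient];
  return `Goal set. You've had ${Math.round(eaten)} of ${amount} so far today.`;
}
```

In the real app each function would sit behind a Dialogflow intent, with the amount, unit, and food arriving as extracted entity parameters in the webhook request.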
Challenges I ran into
Working with completely new languages and tools that were not designed to be used together made integration hard. The first major challenge was retrieving user information from the Google Assistant without using Firebase. The second was handling permissions and integrating a separate database on a platform not really designed to connect to one.
Accomplishments that I'm proud of
Although I wish I could have developed two or three more of the features I am planning to implement, I am proud of having a running application that both makes API calls and connects to a database. Learning these technologies was a wonderful experience that will help me develop this application further in the future.
What I learned
All of the technology used in this project was new to me. Working for the first time with the Google Assistant development environment, I learned how to build entities and intents, connect a webhook, and develop a conversation with machine learning. Having never worked with Node.js before, I was also able to learn more about a popular server-side environment.
What's next for KeepMyCal
Due to major time constraints, the app currently has only one of the many features it will eventually have. After completing those features and cleaning up the user interface, I would like to add micronutrient tracking, displaying a chart of your daily intake on devices with a screen. Eventually, the app will track each user's goals and intake and suggest foods to help them meet the goals they have set.