Inspiration

I'm the father of a 7-year-old daughter with Type 1 diabetes, and I'm responsible for counting the carbs in every food item she eats so I can give her the right amount of insulin. Getting accurate carb information at restaurants can be challenging given the confusing website each restaurant has. My vision is that we should be able to leverage the latest in AI technology to get an accurate count of the carbs in any food item, with the fewest number of clicks, and avoid SWAG-ing (Scientific Wild Ass Guessing) at all costs.

What it does

Swagz is an app that lets Type 1 diabetics and their caregivers simply take a picture of a food item. Swagz recognizes the food item in the picture and, based on the GPS location where the picture was taken, pulls up the carb information and presents it to the user.

How I built it

I built the application using AngularJS and Azure services: Custom Vision (Cognitive Services), Blob Storage, Azure Functions, and an Azure SQL database. Most of the work went into training and deploying the Custom Vision API on customvision.ai. I took a lot of pictures of each menu item during a recent trip to Texas Roadhouse and used those pictures to train the model to identify those items. I used Azure Functions to let the app interact with the database, which stores the restaurant GPS locations and the carb information for all menu items.
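As a rough illustration of the lookup side, the sketch below shows what a Node.js Azure Function backed by the `mssql` package could look like; the table and column names (`Restaurants`, `MenuItems`, `CarbsGrams`, and so on) are hypothetical, not the actual schema.

```javascript
// HTTP-triggered Azure Function (Node.js sketch): given a lat/long and a
// recognized menu item tag, return the carb count from the nearest known restaurant.
// Assumes the `mssql` package and hypothetical Restaurants/MenuItems tables.
const sql = require('mssql');

module.exports = async function (context, req) {
    const { latitude, longitude, itemTag } = req.body || {};

    const pool = await sql.connect(process.env.SQL_CONNECTION_STRING);

    // Find the closest stored restaurant location, then the carbs for that item.
    // (A real query would use geography types or a proper distance formula.)
    const result = await pool.request()
        .input('lat', sql.Float, latitude)
        .input('lng', sql.Float, longitude)
        .input('tag', sql.NVarChar, itemTag)
        .query(`
            SELECT TOP 1 m.ItemName, m.CarbsGrams
            FROM Restaurants r
            JOIN MenuItems m ON m.RestaurantId = r.Id
            WHERE m.ItemName = @tag
            ORDER BY (r.Latitude - @lat) * (r.Latitude - @lat)
                   + (r.Longitude - @lng) * (r.Longitude - @lng)
        `);

    context.res = result.recordset.length
        ? { status: 200, body: result.recordset[0] }
        : { status: 404, body: { error: 'Menu item not found near this location' } };
};
```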

Challenges I ran into

Firstly, I was new to Azure Cognitive Services and couldn't figure out the right URL format to query the API and get results from the model. I had to get help from Azure experts, along with some trial and error, to arrive at the right URL.
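For context, the general request shape for a published Custom Vision classification model looks roughly like the sketch below, assuming the v3.0 prediction endpoint; the region, project ID, iteration name, key, and file name are all placeholders.

```javascript
// Sketch of querying a published Custom Vision classification model.
// All identifiers below (region, project GUID, iteration name, key) are placeholders.
const fs = require('fs');
const fetch = require('node-fetch'); // node-fetch v2 (CommonJS)

const PREDICTION_URL =
  'https://southcentralus.api.cognitive.microsoft.com/customvision/v3.0/Prediction/' +
  '<PROJECT_ID>/classify/iterations/<PUBLISHED_ITERATION_NAME>/image';

async function classifyImage(imagePath) {
  const response = await fetch(PREDICTION_URL, {
    method: 'POST',
    headers: {
      'Prediction-Key': process.env.CUSTOM_VISION_PREDICTION_KEY,
      'Content-Type': 'application/octet-stream',
    },
    body: fs.readFileSync(imagePath), // raw image bytes
  });

  const result = await response.json();
  // Predictions come back as [{ tagName, probability }, ...]; sort to get the best guess first.
  return result.predictions.sort((a, b) => b.probability - a.probability);
}

classifyImage('./rolls.jpg').then(predictions => console.log(predictions[0]));
```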

Secondly, I thought of leveraging the default object detection API to recognize food items and soon realized it was not specific enough. That's when I started building a Custom Vision model on customvision.ai, where I created a project, uploaded pictures of the food items, and trained the model.
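Those same customvision.ai steps can also be scripted. The sketch below shows roughly what the workflow looks like with the Node.js training SDK (`@azure/cognitiveservices-customvision-training`); the keys, endpoint, resource ID, tag name, and photo folder are placeholders.

```javascript
// Sketch of the Custom Vision training workflow, scripted with the Node.js SDK.
// Keys, endpoint, resource ID, tag names, and file paths below are placeholders.
const fs = require('fs');
const msRest = require('@azure/ms-rest-js');
const TrainingApi = require('@azure/cognitiveservices-customvision-training');

const trainingKey = process.env.CUSTOM_VISION_TRAINING_KEY;
const endpoint = 'https://<your-resource>.cognitiveservices.azure.com/';

const credentials = new msRest.ApiKeyCredentials({ inHeader: { 'Training-key': trainingKey } });
const trainer = new TrainingApi.TrainingAPIClient(credentials, endpoint);

async function trainMenuModel() {
  const project = await trainer.createProject('Swagz - Texas Roadhouse');

  // One tag per menu item, then upload the photos taken of that item.
  const rollsTag = await trainer.createTag(project.id, 'Fresh-Baked Rolls');
  for (const file of fs.readdirSync('./photos/rolls')) {
    const imageData = fs.readFileSync(`./photos/rolls/${file}`);
    await trainer.createImagesFromData(project.id, imageData, { tagIds: [rollsTag.id] });
  }

  // Kick off training, wait for it to finish, then publish the iteration
  // so the prediction API can be called against it.
  let iteration = await trainer.trainProject(project.id);
  while (iteration.status !== 'Completed') {
    await new Promise(resolve => setTimeout(resolve, 2000));
    iteration = await trainer.getIteration(project.id, iteration.id);
  }
  await trainer.publishIteration(project.id, iteration.id, 'swagzModel',
    process.env.CUSTOM_VISION_PREDICTION_RESOURCE_ID);
}

trainMenuModel().catch(console.error);
```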

Thirdly, orchestrating all the different moving pieces through the AngularJS app was a challenge that took many hours of coding and debugging to get right.
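On the front end, that wiring boils down to a service that grabs the device location, sends the photo for classification, and then asks the carb-lookup Function for a match. A simplified AngularJS sketch, with both endpoint URLs and the key as placeholders:

```javascript
// Simplified AngularJS service tying the pieces together:
// 1. get GPS coordinates, 2. classify the photo, 3. look up carbs for the top tag.
// The endpoint URLs and key are placeholders for the prediction API and the Azure Function.
angular.module('swagz').factory('carbService', ['$http', '$q', function ($http, $q) {

  function getPosition() {
    return $q(function (resolve, reject) {
      navigator.geolocation.getCurrentPosition(
        pos => resolve(pos.coords),
        err => reject(err));
    });
  }

  function lookupCarbs(imageBlob) {
    return getPosition().then(function (coords) {
      return $http.post('<CUSTOM_VISION_PREDICTION_URL>', imageBlob, {
        headers: { 'Prediction-Key': '<KEY>', 'Content-Type': 'application/octet-stream' }
      }).then(function (response) {
        // Pick the highest-probability tag returned by Custom Vision.
        const top = response.data.predictions
          .reduce((a, b) => (a.probability >= b.probability ? a : b));
        return $http.post('<CARB_LOOKUP_FUNCTION_URL>', {
          latitude: coords.latitude,
          longitude: coords.longitude,
          itemTag: top.tagName
        });
      }).then(response => response.data); // e.g. { ItemName, CarbsGrams }
    });
  }

  return { lookupCarbs: lookupCarbs };
}]);
```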

Accomplishments that I'm proud of

I'm proud to have finally built something innovative and useful for Type 1 diabetics. I know there's a lot more to be done to make this app market-ready, but the key infrastructure is in place; I still need to add security and error handling to make the app more robust.

What I learned

Training AI vision is very difficult. To create separation between food items that look very similar, like fries and chicken nuggets, the model needs a lot more pictures to train adequately. The best way is to assist the algorithm with other context, such as GPS location, time of day, and personal eating habits, to make the identification more accurate.
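One way to picture that idea: instead of trusting the raw prediction alone, re-weight the candidate tags toward items that are actually on the menu of the restaurant the GPS fix says you are in. A hypothetical sketch of that filtering step, with an arbitrary boost factor chosen only for illustration:

```javascript
// Hypothetical sketch: bias the vision model's guesses toward items that are
// actually on the menu of the restaurant the user is standing in.
function pickMenuAwarePrediction(predictions, nearbyMenuItems) {
  // predictions:     [{ tagName, probability }, ...] from Custom Vision
  // nearbyMenuItems: item names from the nearest restaurant (looked up via GPS)
  const onMenu = new Set(nearbyMenuItems.map(name => name.toLowerCase()));

  const ranked = predictions
    .map(p => ({
      ...p,
      // Boost items that appear on this restaurant's menu; the 2x weight is
      // arbitrary, not a tuned value.
      score: onMenu.has(p.tagName.toLowerCase()) ? p.probability * 2 : p.probability
    }))
    .sort((a, b) => b.score - a.score);

  return ranked[0];
}

// Example: "fries" and "chicken nuggets" look alike, but only one is on this menu.
console.log(pickMenuAwarePrediction(
  [{ tagName: 'chicken nuggets', probability: 0.48 }, { tagName: 'fries', probability: 0.45 }],
  ['fries', 'steak', 'rolls']
));
```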

What's next for Swagz

I want to build it into an iOS app first, train the model on more menu items from the most popular fast food and restaurant chains, and build integrations with other diabetes apps such as Nightscout and Tidepool.

Built With

AngularJS, Azure Custom Vision (Cognitive Services), Azure Blob Storage, Azure Functions, Azure SQL Database
