It is estimated that over 15 million Americans are affected by similar food restrictions, and many more abide by strict diets as a result of medical concerns, such as breast cancer or Crohn's disease. On a more personal level, a member of our team recently had an allergy scare after eating a snack whose nutritional label was written in a foreign language, bringing this issue close to home for us.
What it does
Suppose you are traveling internationally and walk into a foreign grocery market to buy some food. The problem: you have a life-threatening peanut allergy, but you cannot read the allergen information, which is written in a language you don't understand. AllerFree stores a database of all your dietary restrictions and scans for them in snapshots of multilingual nutritional labels taken on your mobile phone. With AllerFree, gone are the days of fearing that you might accidentally trigger a life-threatening allergy.
How we built it
The user flow begins with a native camera application written in Swift. This application sends the captured image to a deployed Python Flask app via a REST POST request. The Flask app is deployed on Google Cloud; its HTML front-end was replaced with custom request handlers so the app can upload the captured image straight into a Google Cloud Storage bucket. The Google Vision API then extracts the text from the image, and the Google Translate API converts that text into English. The string comparisons against the user's stored restrictions are computed in the Flask web app, which also accounts for discrepancies in plurality (e.g., "peanut" vs. "peanuts"). The results are then returned to the Swift app, which presents them to the user as an alert.
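The plurality-aware comparison step can be sketched in plain Python. Note that `normalize` and `find_allergens` are hypothetical names chosen for illustration, and a real deployment would read the restrictions from the user's stored database rather than a hard-coded list:

```python
import re

def normalize(word: str) -> str:
    """Crude singular form: strip a trailing 'es' or 's' so that
    'peanuts' matches 'peanut'. A production system might use a proper
    lemmatizer; this mirrors the simple plurality handling described."""
    word = word.lower()
    for suffix in ("es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def find_allergens(label_text: str, restrictions: list[str]) -> set[str]:
    """Return the user's restrictions that appear in the (translated) label text."""
    tokens = {normalize(t) for t in re.findall(r"[a-zA-Z]+", label_text)}
    return {r for r in restrictions if normalize(r) in tokens}

# Example: translated label text vs. the user's stored restrictions
label = "Ingredients: wheat flour, sugar, peanuts, palm oil, salt"
print(find_allergens(label, ["peanut", "tree nuts", "soy"]))  # {'peanut'}
```

A simple suffix-stripping rule like this is enough to catch "peanut" in a label that lists "peanuts", without pulling in a full NLP dependency.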
Challenges we ran into
A major challenge we faced was integrating the REST APIs between our Swift and Flask components. In addition, we wanted our solution to stay simple and elegant, which meant re-purposing the Google Cloud libraries to better suit our needs.
Accomplishments that we are proud of
With so many features spread across several APIs, it was crucial to have robust handshake protocols between the Swift front-end and the Flask back-end. Building those protocols let the Flask application scale as we integrated new features, resulting in an efficient solution that combines Google Vision and Google Translate to take full advantage of cloud computing. This structure also shaped our division of labor, since each of us could focus on a specific piece of the overall user experience. Along the way, we re-designed key components of the Google Cloud libraries to work with our HTTP POSTs from Swift, sending images through an HTML form payload to the Flask app without requiring an external upload or any browser interaction beyond the native app.
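To illustrate the form-payload hand-off, here is a rough Python sketch of the kind of multipart/form-data body an HTML form (or a native app's HTTP client) sends. The field name `"file"` and the `build_multipart_body` helper are assumptions for illustration, not our exact implementation:

```python
import uuid

def build_multipart_body(field_name: str, filename: str, image_bytes: bytes):
    """Assemble a multipart/form-data payload by hand, mimicking what an
    HTML <form> file upload sends. Returns the raw body and the
    Content-Type header value that carries the boundary string."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field_name}"; filename="{filename}"\r\n'
        f"Content-Type: image/jpeg\r\n\r\n"
    ).encode() + image_bytes + f"\r\n--{boundary}--\r\n".encode()
    content_type = f"multipart/form-data; boundary={boundary}"
    return body, content_type

# The Swift side would POST a body like this; the Flask side can then
# read the image from the request's files without any browser involved.
body, ctype = build_multipart_body("file", "label.jpg", b"\xff\xd8fakejpegdata")
print(ctype)
```

Because the payload follows the same format a browser form would produce, the server-side upload handlers work identically whether the request comes from a web page or a native app.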
What we learned
Integrating all of these APIs and frameworks to work together, our largest accomplishment, gave us the opportunity to learn these concepts at a fundamental level. Getting the REST requests to work between the Swift app and the Flask back-end taught us how APIs connect multiple tools in practice. Finally, we gained a greater understanding of image processing by investigating the techniques involved, including deskewing and binarization for improving image quality before text recognition.
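As a minimal sketch of the binarization idea we investigated (using a simple mean threshold; production OCR pipelines typically use smarter thresholds such as Otsu's method, alongside deskewing):

```python
def binarize(pixels: list[int]) -> list[int]:
    """Global-threshold binarization: map each grayscale pixel (0-255)
    to pure black (0) or pure white (255) based on the mean intensity,
    separating ink from background before text recognition."""
    threshold = sum(pixels) / len(pixels)
    return [255 if p > threshold else 0 for p in pixels]

# A toy row of grayscale pixels: dark text strokes among a light background
print(binarize([12, 200, 34, 220, 180, 15]))  # [0, 255, 0, 255, 255, 0]
```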
This being our first hackathon, it also took some time to get fully acquainted with our workflows and to collaborate effectively as a team.
What's next for AllerFree
An interesting feature to investigate is tracking nutritional information throughout the day from these images and providing analytics and data visualizations to the user. We would also like the user's dietary-restriction input to support other languages. Finally, connecting a live database of the dietary restrictions associated with various medical conditions would let a user get the same functionality by specifying only a medical condition, rather than listing the specific ingredients themselves.