Hackers
Our FemmeHacks team members are Rhea Kapur (Pingry '21), Eva Schiller (Pingry '21), Olivia Taylor (Pingry '23), and Emma Huang (Pingry '21). Rhea, Eva, and Emma are Pingry seniors, and Olivia is a sophomore. We are all involved with the Pingry Girl Code club.
Inspiration
Our team of hackers is no stranger to food allergies. As such, we know that in addition to being an inconvenience, allergies can be incredibly dangerous. For an example, look no further than your average grocery store food label.
These ingredient lists combine tiny fonts with long, technical words, making them a huge challenge for nearly anyone managing allergies. For young children, for those with poor or partial eyesight, and for those who do not speak English as their native language, scouring hard-to-read food labels to confirm a food is safe to eat can be an uphill battle with dangerous medical consequences.
AllerScan aims to relieve individuals with allergies of this daily burden. Users simply take a photo of the food label, and AllerScan scans it and informs them if the item contains any of their particular allergens. Its simple UI makes it easy for children to use, while its large font makes it accessible to those with poor eyesight. For non-English speakers, AllerScan can handle allergen inputs from four other languages.
With a quick, accessible, portable allergist in their pockets, we hope that those with allergies can go about their daily lives without the inconvenience of reading food labels, and more importantly, without the danger of misreading them.
What it does
AllerScan is an iOS app that tells the user whether a certain food or medicine contains their particular allergens. First, it prompts the user to choose their language and enter their allergies. Then, the user can take a photo of any food label, and AllerScan will tell them whether or not it is safe to eat! If it isn't, AllerScan will let them know which allergens are present in the food. This is a fast, easy, and portable way for those with allergies to quickly identify which foods are safe for them to eat.
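At its core, the check is a cross-reference between the recognized label text and the user's saved allergen list. Here is a minimal sketch of that matching logic (the function name findAllergens is ours for illustration, not the exact code in the app):

```swift
import Foundation

/// Returns the subset of the user's allergens that appear in the scanned label text.
/// Matching is case-insensitive, so "Peanuts" on a label matches a stored "peanut".
func findAllergens(in labelText: String, userAllergens: [String]) -> [String] {
    let normalizedLabel = labelText.lowercased()
    return userAllergens.filter { normalizedLabel.contains($0.lowercased()) }
}

// Example: a label listing "Milk, Peanuts, Soy Lecithin" against two saved allergens.
let flagged = findAllergens(in: "Ingredients: Milk, Peanuts, Soy Lecithin",
                            userAllergens: ["peanut", "hazelnut"])
// flagged == ["peanut"], so the food is reported as unsafe.
```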
AllerScan is not limited to English-speaking users: it can be used by people worldwide, and for foods from around the globe. It's especially helpful for travel - want to tuck into a delectable box of chocolates, but don't know French? No need to Google Translate the ingredients word by word: enter AllerScan, and gone are your worst nightmares of reactions to "maybe marzipan" and hidden hazelnut.
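Behind the scenes, the non-English path leans on Google Cloud's translation service (see "How we built it" below). As a rough sketch, a request to the Cloud Translation v2 REST API from Swift might look like this, assuming a project API key is available (apiKey here is a placeholder, not a real credential):

```swift
import Foundation

/// Translates `text` into English via the Google Cloud Translation v2 REST API.
/// `apiKey` stands in for a real Google Cloud project key.
func translateToEnglish(_ text: String, apiKey: String,
                        completion: @escaping (String?) -> Void) {
    var components = URLComponents(string: "https://translation.googleapis.com/language/translate/v2")!
    components.queryItems = [
        URLQueryItem(name: "key", value: apiKey),
        URLQueryItem(name: "q", value: text),
        URLQueryItem(name: "target", value: "en")
    ]
    URLSession.shared.dataTask(with: components.url!) { data, _, _ in
        guard
            let data = data,
            let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
            let payload = json["data"] as? [String: Any],
            let translations = payload["translations"] as? [[String: Any]],
            let translated = translations.first?["translatedText"] as? String
        else { completion(nil); return }
        completion(translated)
    }.resume()
}
```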
How we built it
We used Xcode and SwiftUI to build the iOS app and design the user interface. For performing text recognition on an image (required for turning photos of food labels and ingredient lists into words), we used the Firebase ML Kit for iOS (specifically its Vision text-recognition APIs). For translating text between languages, we used the Google Cloud Translation API. For the camera integration, which lets the user take photos of their food, we connected AVFoundation's AVCaptureSession to SwiftUI. We built the entire app from scratch today using the MVVM design pattern and designed the logo in Canva.
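For reference, the on-device text recognition step with the Firebase ML Kit Vision APIs looks roughly like the sketch below (illustrative rather than our exact code; the UIImage is assumed to come from the AVCaptureSession photo output):

```swift
import FirebaseMLVision
import UIKit

/// Runs ML Kit's on-device text recognizer over a captured label photo
/// and hands back the raw recognized text.
func recognizeLabelText(in photo: UIImage, completion: @escaping (String?) -> Void) {
    let textRecognizer = Vision.vision().onDeviceTextRecognizer()
    let visionImage = VisionImage(image: photo)
    textRecognizer.process(visionImage) { result, error in
        guard error == nil, let result = result else {
            completion(nil)
            return
        }
        // result.text is the full recognized text; per-block and per-line
        // results are also available on the VisionText object.
        completion(result.text)
    }
}
```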
Challenges we ran into
One of our biggest challenges was connecting AllerScan's back-end, which converts images to text with the Firebase ML Kit and cross-checks the results against the user's allergens, to our user interface. Most of the team is new to app development, so we needed to learn a lot about SwiftUI's EnvironmentObject and ObservedObject property wrappers before our back-end Models and front-end Views could communicate effectively.
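The fix, in essence, was to let the Views observe a shared model object. A stripped-down sketch of that wiring, with illustrative type names rather than our exact ones:

```swift
import SwiftUI

/// Shared app state: the user's saved allergens and the result of the latest scan.
/// Because the properties are @Published, any View observing this object re-renders
/// as soon as the back-end posts a new scan result.
final class ScanViewModel: ObservableObject {
    @Published var userAllergens: [String] = []
    @Published var detectedAllergens: [String] = []
}

struct ResultView: View {
    // Injected once at the root view with .environmentObject(ScanViewModel()).
    @EnvironmentObject var viewModel: ScanViewModel

    var body: some View {
        Group {
            if viewModel.detectedAllergens.isEmpty {
                Text("Safe to eat!")
            } else {
                Text("Contains: " + viewModel.detectedAllergens.joined(separator: ", "))
            }
        }
        .font(.largeTitle)  // large type for readability
    }
}
```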
Accomplishments that we're proud of
We are proud of the fact that today, we accomplished so many things that we never could have done the day before! From finding text in an image, to translating words between languages, to making an app in the first place, today was a day of firsts for most members of our team.
What we learned
On the soft-skills side, this was our first time participating in a virtual hackathon! Collaborating on code remotely is definitely a new skill, and we are so glad to have had the chance to practice it. On the technical side, we learned even more! Coming into this challenge, the majority of the team was totally new to app development, so the entire project was a learning experience: we had to teach ourselves Swift as we went. On a more specific note, we discovered how to build a camera service with AVFoundation, feed its photos into computer vision software, and perform text recognition with Google Cloud's tools!
What's next for AllerScan
We hope to increase accessibility by integrating more language options, as well as to develop our own database of ingredients that contain an allergen but do not explicitly name it. For instance, if you have a gluten allergy, we would like AllerScan to also flag items such as wheat flour and pasta.
Furthermore, allergens are oftentimes linked to other food groups - for example, almonds are part of the Prunus family, which also includes peaches, a stone fruit. Our app currently identifies the user's allergens on a label; in expanding it, we hope to also highlight these kinds of related "caution" ingredients on a given food label.
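One way we might structure this expansion is a simple lookup table from each allergen to its hidden sources and its related "caution" ingredients. A hypothetical sketch (nothing here is implemented yet):

```swift
/// Hypothetical lookup tables for the planned expansion.
/// hiddenSources maps an allergen to ingredients that contain it without naming it;
/// cautionRelatives maps an allergen to related ingredients worth a warning.
let hiddenSources: [String: [String]] = [
    "gluten": ["wheat flour", "pasta"],
    "almond": ["marzipan"]
]

let cautionRelatives: [String: [String]] = [
    // Almonds and peaches are both in the Prunus family, so a peach ingredient
    // could earn a caution flag for someone with an almond allergy.
    "almond": ["peach"]
]

/// Expands a user's allergen into the full set of label terms worth searching for.
func searchTerms(for allergen: String) -> [String] {
    let key = allergen.lowercased()
    return [key] + (hiddenSources[key] ?? []) + (cautionRelatives[key] ?? [])
}
```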
Built With
- avfoundation
- firebase-ml-toolkit
- google-cloud
- google-cloud-translation
- google-firebase
- swift
- swiftui
- xcode