Our team originally intended to work with Amazon Alexa, but our plans changed when we couldn't borrow an Echo Dot for demoing. Despite having no experience with Google Home or Google Actions, we were determined to create a project that is just as wholesome as it is helpful, so we borrowed a Google Home instead. Within a few hours of finding a unique action that Google hadn't implemented, "Can I eat it?" was born.

What it does

When a user wants to find out whether a certain food contains a possible allergen such as gluten, soy, shellfish, fish, dairy, or tree nuts, they can simply ask Google Home "Is there gluten in Hershey's?" and it will respond with a statement confirming or denying the presence of the allergen. You can ask "Is there milk in Cheetos?" or "Is there dairy in nacho cheese?" The list goes on! You can even ask about multi-word ingredients, like "Can I eat caesar salad if I'm allergic to green peppers?", or goofy questions like "Is there chocolate in chocolate?" Oooh, the possibilities!

How we built it

Using the power of Google Home through API.AI/Dialogflow, we took advantage of the convenience of speaking to a smart hub device while handling calls to an API outside the Google domain. Through the USDA national food database's public API, we can query a given food item and check whether it contains the allergen specified. We accomplished searches using that very same API, which ranks results by relevancy.
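The flow above can be sketched in JavaScript. This is a minimal sketch, not the project's actual code: the endpoint shape (here, the USDA FoodData Central search API), the field names, and the `canIEat` helper are assumptions for illustration.

```javascript
// Pure helper: case-insensitive check of an ingredient statement
// for a named allergen.
function containsAllergen(ingredientStatement, allergen) {
  return ingredientStatement.toLowerCase().includes(allergen.toLowerCase());
}

// Hypothetical lookup: search the USDA database for a food, take the
// most relevant match (results come back ranked by relevancy), and
// test its ingredient list for the allergen. Endpoint and field names
// are assumptions, not taken from the original project.
async function canIEat(food, allergen, apiKey) {
  const url =
    "https://api.nal.usda.gov/fdc/v1/foods/search?query=" +
    encodeURIComponent(food) + "&api_key=" + apiKey;
  const res = await fetch(url);
  const data = await res.json();
  const top = data.foods && data.foods[0]; // most relevant match
  if (!top || !top.ingredients) {
    return `I couldn't find ingredients for ${food}.`;
  }
  return containsAllergen(top.ingredients, allergen)
    ? `Careful: ${food} appears to contain ${allergen}.`
    : `No ${allergen} listed in the ingredients for ${food}.`;
}
```

The substring check is deliberately simple; matching "milk" against an ingredient statement like "CHEESE (MILK, SALT)" works, but a production version would want smarter matching for derived ingredients (e.g. whey, casein).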

Challenges we ran into

Our greatest challenges came in the first 5 to 6 hours of the project. Despite there being four of us, we spent hours researching how to work with Google, implement an API, and create dialog that is both friendly and directive. For the majority of our team, the lack of experience with an assistant hub slowed us down. But once we figured out API integration and how to build intents, we got the ball rolling.
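Once we understood intents, the handler itself was small: Dialogflow extracts the food and allergen as parameters, and the fulfillment code turns them into a spoken reply. The sketch below is hypothetical; the parameter names (`food`, `allergen`) and the `lookupIngredients` function are stand-ins for illustration.

```javascript
// Hypothetical intent handler: given the parameters Dialogflow
// extracted from the user's utterance, build the response text.
// lookupIngredients is a stand-in for the USDA database query.
function buildResponse(params, lookupIngredients) {
  const ingredients = lookupIngredients(params.food);
  if (!ingredients) {
    return `Sorry, I couldn't find ${params.food}.`;
  }
  const found = ingredients
    .toLowerCase()
    .includes(params.allergen.toLowerCase());
  return found
    ? `Watch out! ${params.food} may contain ${params.allergen}.`
    : `I didn't find ${params.allergen} in ${params.food}.`;
}
```

Keeping the response logic as a pure function like this makes it easy to test without a live Google Home or network connection.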

Accomplishments that we're proud of

We are excited to hear Google respond when we prompt it, and to actually get feedback when we feed it an allergen and a food to check. It's amazing to work audibly with something we have spent hours typing code into, so for us, as a team, one of the biggest achievements was hearing those responses out loud for the first time. Apart from that, we really struggled with API integration since the documentation is sparse, so when we could finally query the USDA database, our action became so much more than the Entities we had been working with: we finally had Fulfillment.

What we learned

We learned a lot about the platform and about working with assistant hubs in general. None of us had prior experience with Google Home, and most of us had little experience coding in JavaScript or using Dialogflow. It was an amazing learning experience for all of us, and even the member who had worked with an Alexa hub before got to see a whole different side of smart assistants.

What's next for Can I Eat This?

We would love to be able to process multiple allergens at once and check multiple products. It would also be great to find a way to determine whether a food is vegetarian, vegan, fair trade, or celiac-safe.
