Many of us know the feeling: we're hungry, but we just don't know what to eat. Choosing food is hard because there are simply too many options out there.
What it does
The app takes a picture of your face, or a short passage of your thoughts, and determines your mood using the Google Cloud Vision API (for faces) and the Google Cloud Natural Language API (for text). It then recommends a list of foods you may want to try based on that mood.
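The face-based path can be sketched as picking whichever emotion the Vision API rates most likely on a detected face. The likelihood names below match the real API's face-annotation fields, but the scoring and the plain-dict input are illustrative assumptions, not the app's actual code:

```python
# Vision's face annotations report each emotion as a likelihood string;
# map those to numbers so we can compare them (an assumed scoring).
LIKELIHOOD_SCORE = {
    "VERY_UNLIKELY": 0, "UNLIKELY": 1, "POSSIBLE": 2,
    "LIKELY": 3, "VERY_LIKELY": 4,
}

def dominant_mood(face):
    """Return the emotion rated most likely on a face.

    `face` is a dict of emotion -> likelihood string, standing in for
    the face annotation the real API returns.
    """
    return max(face, key=lambda emotion: LIKELIHOOD_SCORE[face[emotion]])

# Example: a face the API rates as very likely joyful.
face = {"joy": "VERY_LIKELY", "sorrow": "UNLIKELY",
        "anger": "VERY_UNLIKELY", "surprise": "POSSIBLE"}
print(dominant_mood(face))  # → joy
```

The text path would work the same way, except the mood comes from the Natural Language API's sentiment score instead of face likelihoods.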
How I built it
The app was built in Python, with Flask as the back-end; the front-end is static HTML/CSS pages.
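A back-end like this boils down to a Flask route that takes a mood and returns food suggestions. This is a minimal sketch, not the actual app: the route name, query parameter, and the hard-coded mood table are all assumptions (the real app derives the mood from the Google Cloud APIs):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical mood -> foods table; stands in for the app's real data.
FOODS_BY_MOOD = {
    "joy": ["ice cream", "sushi"],
    "sorrow": ["mac and cheese", "hot chocolate"],
}

@app.route("/recommend")
def recommend():
    # e.g. GET /recommend?mood=sorrow
    mood = request.args.get("mood", "joy")
    return jsonify(mood=mood, foods=FOODS_BY_MOOD.get(mood, []))

if __name__ == "__main__":
    app.run(debug=True)
```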
Challenges I ran into
Getting the APIs to work, and building the front-end (the UI in particular).
Accomplishments that I'm proud of
The fact that it works
What I learned
How to build a web app with Flask as the back-end
What's next for Hangry
Support more feelings and emotions (the set is limited right now). Support more foods and varieties. Move to a database that isn't just a CSV file. Make the app learn user preferences. Integrate with delivery apps.
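For context, a CSV-file "database" of the kind being replaced can be queried with Python's standard csv module. The file layout (mood,food columns) and sample rows below are assumptions for illustration, not the app's actual data:

```python
import csv
import io

# Stand-in for the app's CSV file, assumed to map moods to foods.
CSV_DATA = """mood,food
joy,sushi
joy,ice cream
sorrow,hot chocolate
"""

def foods_for_mood(csv_file, mood):
    """Return every food listed for the given mood."""
    reader = csv.DictReader(csv_file)
    return [row["food"] for row in reader if row["mood"] == mood]

print(foods_for_mood(io.StringIO(CSV_DATA), "joy"))  # → ['sushi', 'ice cream']
```

A real database would replace the linear scan above with an indexed query, which is the main motivation for the move.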