I always open the Yelp app intending to find a great place to eat that meets all my personal needs. But I always end up with so many results, and so many factors to balance (cost, reviews, cuisine, distance), that I just pick something randomly from the first few restaurants on the list.

What it does

Provides a conversational interface to food-related data sources, ranging from Yelp to food safety scores, and compiles them into a single personality. It makes small talk, gives information on restaurants, collects user food preferences and habits, intelligently suggests restaurants, and corrects course based on feedback. It also has cool UI tricks to make usage smoother, such as simply snapping a pic of nearby stores to get a quick and frictionless overview.
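The suggestion step could be sketched as a weighted ranking over the factors mentioned above. This is a minimal illustration, not the actual system: the `Restaurant` fields, the preference weights, and the normalization choices are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Restaurant:
    name: str
    rating: float      # Yelp-style star rating, 0-5
    price_level: int   # 1 (cheap) to 4 (expensive)
    distance_km: float
    safety_score: int  # food safety inspection score, 0-100

def score(r: Restaurant, prefs: dict) -> float:
    # Normalize each factor to roughly [0, 1], then take a weighted sum.
    # prefs maps factor name -> weight reflecting how much the user cares.
    return (
        prefs["rating"] * (r.rating / 5.0)
        + prefs["price"] * (1 - (r.price_level - 1) / 3.0)   # cheaper is better
        + prefs["distance"] * (1 / (1 + r.distance_km))       # closer is better
        + prefs["safety"] * (r.safety_score / 100.0)
    )

def suggest(restaurants, prefs, top_n=3):
    # Return the top_n restaurants by personalized score.
    return sorted(restaurants, key=lambda r: score(r, prefs), reverse=True)[:top_n]
```

In a conversational setting, the weights in `prefs` would be nudged over time as the bot learns the user's habits (e.g. raising the `price` weight after the user rejects an expensive suggestion).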

How I built it

By heaping together eclectic data sources, online APIs, online cognitive services, custom neural networks, and other machine learning models such as random forests, all while building as smooth and intuitive a conversational UI as possible.
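The random-forest idea from above can be illustrated in miniature: train many weak learners on bootstrap samples of the data, then let them vote. This toy sketch uses one-feature decision stumps in place of full trees and made-up like/dislike data; it is only meant to show the bagging-and-voting pattern, not the project's actual model.

```python
import random

# Toy feature vectors: [rating, price_level, distance_km]; label 1 = user liked it.
DATA = [
    ([4.5, 2, 1.0], 1),
    ([4.8, 3, 0.5], 1),
    ([2.5, 1, 3.0], 0),
    ([3.0, 4, 5.0], 0),
    ([4.2, 2, 2.0], 1),
    ([2.0, 3, 4.0], 0),
]

def train_stump(sample):
    # Pick a random feature, then find the threshold and polarity that
    # best separate liked from disliked examples in this sample.
    feat = random.randrange(3)
    best = None
    for thresh in {x[feat] for x, _ in sample}:
        for polarity in (1, -1):
            correct = sum(
                1 for x, y in sample
                if (1 if polarity * (x[feat] - thresh) > 0 else 0) == y
            )
            if best is None or correct > best[0]:
                best = (correct, feat, thresh, polarity)
    return best[1:]  # (feature, threshold, polarity)

def train_forest(data, n_trees=25):
    # Bagging: each stump is trained on a bootstrap resample of the data.
    return [train_stump([random.choice(data) for _ in data]) for _ in range(n_trees)]

def predict(forest, x):
    # Majority vote across all stumps.
    votes = sum(
        1 if polarity * (x[feat] - thresh) > 0 else 0
        for feat, thresh, polarity in forest
    )
    return 1 if votes > len(forest) / 2 else 0
```

A real implementation would use full decision trees over many more features (cuisine, safety scores, past choices), e.g. via scikit-learn's `RandomForestClassifier`, but the ensemble-of-bootstrapped-learners structure is the same.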

Challenges I ran into

Model training ETA: 89 hours. Hackathon: over in 12 hours.

Accomplishments that I'm proud of

Actually pulling everything together. I've competed in my past half-dozen hackathons on teams, and it has always turned out great, but I really wanted to see how far I could go on my own merit.

What I learned

How to use Microsoft Cognitive Services and Azure; I also had to debug some low-level TensorFlow/Keras issues. And I learned about a bunch of neat datasets.

Built With

  • theano
  • keras
  • python
  • cython
  • microsoft-cognitive-services
  • azure
  • opendatala