What it does

We let users strike up a conversation with Loki about anything concerning their local surroundings. On each query, we run natural language analysis and, where possible, use the Yelp API to let people search for places and get recommendations in a conversational tone. For anything more complex than Yelp's data can answer, we forward the question to other people in the community, ensuring a timely and accurate response from the people who know their area best.
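
To give a flavour of the Yelp side, each parsed query ends up as something like the Yelp Fusion business search below. This is a minimal sketch, assuming Node 18+ (built-in fetch) and a YELP_API_KEY environment variable; the searchYelp helper and its parameters are illustrative, not our exact code.

```javascript
// Minimal sketch: turn a parsed query into a Yelp Fusion business search.
// Assumes a YELP_API_KEY env var and Node 18+ with built-in fetch.
const YELP_SEARCH_URL = 'https://api.yelp.com/v3/businesses/search';

async function searchYelp(term, latitude, longitude) {
  const params = new URLSearchParams({
    term,                 // e.g. "cheap ramen", extracted by the NL analysis step
    latitude,
    longitude,
    limit: '5',
    sort_by: 'best_match',
  });

  const res = await fetch(`${YELP_SEARCH_URL}?${params}`, {
    headers: { Authorization: `Bearer ${process.env.YELP_API_KEY}` },
  });
  if (!res.ok) throw new Error(`Yelp request failed: ${res.status}`);

  const { businesses } = await res.json();
  // Turn the raw results into short, conversational suggestions.
  return businesses.map(b => `${b.name} (${b.rating}★, ${b.location.address1})`);
}
```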

Update: We're also using the Google Vision API to allow you to search Yelp for things you take a photo of.
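
Roughly, the photo flow labels the image with the Vision API and feeds the top label into the same Yelp search. This is a sketch assuming the @google-cloud/vision client library with application-default credentials; searchYelpFromPhoto is a made-up name and searchYelp is the helper sketched above.

```javascript
// Sketch: label a photo with the Google Vision API, then use the
// highest-confidence label as the Yelp search term.
const vision = require('@google-cloud/vision');
const visionClient = new vision.ImageAnnotatorClient();

async function searchYelpFromPhoto(imageBuffer, latitude, longitude) {
  const [result] = await visionClient.labelDetection({
    image: { content: imageBuffer.toString('base64') },
  });

  const labels = result.labelAnnotations || [];
  if (labels.length === 0) return [];

  // e.g. a photo of a slice comes back labelled "Pizza".
  const term = labels[0].description;
  return searchYelp(term, latitude, longitude);
}
```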

How we built it

We used Node.js on the server side along with Express and socket.io. On the front end, we used React Native, as working with the React state model allowed for much faster iteration and development.
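
For context, the server wiring looks roughly like the sketch below (socket.io v4-style setup; the "query" and "suggestions" event names and the searchYelp helper are illustrative): the React Native app emits a query over the socket, and the server replies with suggestions.

```javascript
// Minimal sketch of the server: Express + socket.io relaying queries
// from the app to the NL analysis / Yelp lookup and back.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

io.on('connection', (socket) => {
  socket.on('query', async ({ text, latitude, longitude }) => {
    const suggestions = await searchYelp(text, latitude, longitude);
    socket.emit('suggestions', suggestions);
  });
});

server.listen(3000, () => console.log('Loki server listening on :3000'));
```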

Challenges we ran into

As none of the team had any iOS experience, building the iOS app was a challenge in itself! We ended up using React Native, and we're pleased with how it turned out.

What we learned

Natural language analysis is hard! We also spent a lot of time at the start of the project trying to use languages we were unfamiliar with, and found that once we switched back to familiar technologies, progress picked up quickly.

What's next for Loki

Refining the natural language analysis and sentiment work we're doing, and allowing follow-up responses (e.g., "find me somewhere more upmarket") to our curated suggestions.
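
One way a refinement like "more upmarket" could work is by nudging the price filter on the previous Yelp query. The sketch below is purely illustrative of that planned behaviour; the keyword matching and parameter handling are assumptions, not shipped code.

```javascript
// Sketch of a planned refinement step: adjust the Yelp "price" filter
// (tiers 1–4) based on a follow-up message, then re-run the search.
function refineQuery(previousParams, followUpText) {
  const params = { ...previousParams };
  const currentPrice = Number(params.price || 2);

  if (/upmarket|fancier|nicer/i.test(followUpText)) {
    params.price = String(Math.min(currentPrice + 1, 4));
  } else if (/cheaper|budget/i.test(followUpText)) {
    params.price = String(Math.max(currentPrice - 1, 1));
  }
  return params; // feed back into the Yelp search with adjusted filters
}
```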

We're targeting the Yelp challenge with this hack.
