Every time I ask someone, "Where do you wanna eat?", I get responses like "whatever" or "you pick". Since I kept getting these ambiguous answers, I decided to solve this problem for myself and for everyone else who hears the same thing.

What it does

Users can either type a search query or just hit the "Pick For Me" button, and the app displays pictures of food from restaurants near them. Users swipe right to like and left to dislike, and when they are done the results are ranked by their swipes, with the top row showing the restaurant they swiped right on the most. On a restaurant's page, users can call the restaurant, message friends through iMessage integration, and share on social media. They can also see the rating, the distance to the restaurant, tips, and photos. When the user is ready to go, they hit Go and they're on their way to the restaurant.
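The ranking step above can be sketched in a few lines. This is an illustrative sketch, not the app's actual code: the function name and the (restaurant, liked) tuple shape are assumptions made for the example.

```python
from collections import Counter

def rank_restaurants(swipes):
    """Order restaurants by right swipes, most-liked first.

    swipes: list of (restaurant_name, liked) tuples,
    where liked=True means the user swiped right.
    """
    # Count only the right swipes for each restaurant.
    likes = Counter(name for name, liked in swipes if liked)
    # most_common() yields (name, count) pairs sorted by count, descending.
    return [name for name, _ in likes.most_common()]
```

A restaurant that was only swiped left on never appears in the results, which matches the idea that the top row is the most right-swiped place.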

How we built it

We used Firebase for the backend to store searches and user information. We used Foursquare endpoints to fetch restaurant data, then ran the images through the Google Vision API to filter for food photos. Later, we integrated iMessage, calling, sharing, and navigation so the user can do everything in one place instead of jumping between multiple apps.
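The first step of that pipeline, requesting nearby restaurants from Foursquare, might look roughly like this. This is a hedged sketch: the endpoint path and parameter names follow Foursquare's Places API, but the category ID is an assumption (Foursquare's "Restaurant" category), and the function name is made up for the example.

```python
from urllib.parse import urlencode

# Foursquare Places search endpoint (v3).
FOURSQUARE_SEARCH = "https://api.foursquare.com/v3/places/search"

def restaurant_search_url(lat, lng, query=None):
    """Build a search URL for restaurants near a coordinate."""
    params = {
        "ll": f"{lat},{lng}",     # latitude,longitude of the user
        "categories": "13065",    # assumed: Foursquare's restaurant category ID
    }
    if query:                     # optional free-text search from the user
        params["query"] = query
    return f"{FOURSQUARE_SEARCH}?{urlencode(params)}"
```

The actual request would also need the API key in an Authorization header; the photos returned for each venue would then be passed to the Google Vision API to keep only the ones labeled as food.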

Challenges we ran into

Our main challenges were simplifying the user interface, building the swiping interaction, and integrating the Google Vision API.

Accomplishments that we're proud of

I'm proud of putting everything together: the app relies on so many different APIs, and I had to figure out how to combine them all.

What we learned

I learned how to use many different APIs and make them work together without one breaking another.

What's next for Foodies

I plan to use machine learning to predict what users want each time they hit the "Pick For Me" button.
