Inspiration

Karen came up with the idea for recipe-master because people often have no idea what to cook even when they have plenty of ingredients on hand. For the user's convenience, we decided to use a picture-recognition API so the app can build the item list from photos, and then send that list to a recipe-finding API to obtain recipes relevant to what's in your fridge.

What it does

Due to some technical misfortune, this version of the app only provides the first part of the pipeline: you can build your fridge item list by taking one picture at a time, without typing anything to set it up.

How I built it

First, we searched for picture-recognition and recipe-finder APIs. On iOS, we upload the photo to the recognition API, send an HTTP GET request to the recipe API with the detected items, and apply extra filters so the recipe results better match the user's situation.
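A minimal sketch of that flow with Alamofire 4, assuming hypothetical endpoints and parameter names (vision.example.com, recipes.example.com, an "items" field) in place of the actual APIs we used:

```swift
import Alamofire
import UIKit

// Upload one photo to the picture-recognition API, then query the recipe API
// with whatever items it reports. Endpoints and field names are placeholders.
func findRecipes(for image: UIImage) {
    guard let jpeg = UIImageJPEGRepresentation(image, 0.8) else { return }

    Alamofire.upload(
        multipartFormData: { form in
            form.append(jpeg, withName: "image", fileName: "fridge.jpg", mimeType: "image/jpeg")
        },
        to: "https://vision.example.com/v1/detect",
        encodingCompletion: { result in
            switch result {
            case .success(let upload, _, _):
                upload.responseJSON { response in
                    // Assume the recognition API returns {"items": ["egg", "milk", ...]}.
                    guard let json = response.result.value as? [String: Any],
                          let items = json["items"] as? [String] else { return }

                    // Ask the recipe API for matches, with extra filters (e.g. diet).
                    let params: Parameters = [
                        "items": items.joined(separator: ","),
                        "diet": "none"
                    ]
                    Alamofire.request("https://recipes.example.com/v1/search",
                                      method: .get,
                                      parameters: params)
                        .responseJSON { recipeResponse in
                            print(recipeResponse.result.value ?? "no recipes")
                        }
                }
            case .failure(let error):
                print("Upload failed: \(error)")
            }
        }
    )
}
```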

Challenges I ran into

I didn't discover until 10 hours ago that my iPhone (iOS 10) and my version of Xcode were basically incompatible with each other. The problem got bigger when I updated Xcode and Swift: the pods I used in my previous version, such as Alamofire (upgrading from 3 to 4), no longer accepted their original syntax, so I had to go through all the documentation again to update the code.
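To give one concrete example of the kind of change involved, the basic request call looks different between Alamofire 3 and 4 (the URL and parameters below are placeholders):

```swift
import Alamofire

// Alamofire 3 (Swift 2.x) — the HTTP method came first, as an uppercase enum case:
// Alamofire.request(.GET, "https://recipes.example.com/v1/search",
//                   parameters: ["items": "egg,milk"])
//     .responseJSON { response in /* ... */ }

// Alamofire 4 (Swift 3) — the URL comes first and the method is a named argument:
Alamofire.request("https://recipes.example.com/v1/search",
                  method: .get,
                  parameters: ["items": "egg,milk"])
    .responseJSON { response in
        // Handle the JSON result (or the error) here.
        debugPrint(response)
    }
```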

Accomplishments that I'm proud of

Since the APIs we used were not well documented, I often ended up reading their source code directly, and I worked out how to apply their methods on iOS entirely on my own.

What I learned

How to debug problems with third-party APIs, more practice integrating them into an iOS app, and the latest versions of Swift, Xcode, and pods such as Alamofire.

What's next for recipe-master

Finish the recipe-suggestion part and possibly go further into the recognition algorithm, so that in the future taking a single picture of the fridge will be enough for the app to know all of the food inside it.
