Our project was inspired by the name of this hackathon, New Year's Hack 2021. One of the most common New Year's resolutions is to eat healthier, so we decided to build an app that informs and educates people about food.

What it does

Our app provides users with recipes based on the ingredients they search for. It can also identify ingredients, such as fruits and vegetables, from a picture taken with the phone's camera.

How we built it

We built this project with React Native using Expo, along with several APIs: RecipePuppy for recipe search and Clarifai for object identification.
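A minimal sketch of the recipe-search call, assuming RecipePuppy's public endpoint with its comma-separated `i` ingredient parameter (the helper names here are illustrative, not our exact code):

```javascript
// Build a RecipePuppy search URL from a list of ingredients.
// RecipePuppy's endpoint took a comma-separated `i` parameter.
function buildRecipeUrl(ingredients) {
  const query = encodeURIComponent(ingredients.join(','));
  return `http://www.recipepuppy.com/api/?i=${query}`;
}

// In the app, the JSON response is fetched and rendered as a recipe list.
async function fetchRecipes(ingredients) {
  const response = await fetch(buildRecipeUrl(ingredients));
  const data = await response.json();
  return data.results; // each result has a title, link, and ingredient list
}
```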

Challenges we ran into

There were many small challenges to overcome, but one of the main setbacks came when we tried to use our own custom TensorFlow model: its large size caused a lot of issues, since our application was meant to run on a mobile device.

Accomplishments that we're proud of

One thing we are proud of is how quickly our group found an alternative when the custom TensorFlow model proved too large: we switched to the Clarifai API to detect objects captured by the camera.
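The Clarifai model returns a list of concepts with confidence scores. A small helper like the sketch below can pick out the likely ingredients; the response shape mirrors Clarifai's `outputs[0].data.concepts` predict format, but the helper name and threshold are our own choices for illustration:

```javascript
// Given a Clarifai-style predict response, return the names of concepts
// whose confidence score meets a threshold (0.9 here is an assumption).
function detectedIngredients(response, threshold = 0.9) {
  const concepts = response.outputs[0].data.concepts;
  return concepts
    .filter((c) => c.value >= threshold)
    .map((c) => c.name);
}
```

The filtered names can then be fed straight into the recipe search, which is how the two halves of the app connect.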

What we learned

We learned how useful React Native is for mobile app development, which was an interesting experience because none of us had prior app development experience. We also learned how to build a transfer learning model in TensorFlow and export it to JavaScript. Unfortunately, we were unable to use it in the final app due to its large size.
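As an illustration of the size problem: a TensorFlow.js `model.json` lists every weight tensor in its `weightsManifest`, so the model's download size can be estimated before bundling it into an app. The helper below is a sketch of that check, not code from our project:

```javascript
// Bytes per element for common TensorFlow.js weight dtypes.
const DTYPE_BYTES = { float32: 4, int32: 4, uint8: 1, bool: 1 };

// Estimate a TensorFlow.js model's weight size in bytes from the
// weightsManifest section of its model.json, which records each
// tensor's shape and dtype.
function estimatedModelBytes(modelJson) {
  let total = 0;
  for (const group of modelJson.weightsManifest) {
    for (const weight of group.weights) {
      const elements = weight.shape.reduce((a, b) => a * b, 1);
      total += elements * (DTYPE_BYTES[weight.dtype] || 4);
    }
  }
  return total;
}
```

A check like this makes it easy to see early on whether a model will fit within a reasonable mobile bundle size.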

What's next for FoodFind

We are thinking of launching it on Google Play and the App Store to make it available to everyone.
