Inspiration
Our inspiration for Cookipedia came from the common experience of having ingredients at home but no idea what to cook. We wanted an easy way to generate recipe ideas tailored to what's already in your kitchen.
What it does
Cookipedia lets you snap a photo of your ingredients; its AI identifies them and matches them to delicious, customized recipes. No more wasting time and food while staring blankly at your fridge!
How we built it
We used a combination of deep learning models and other tools:
- Panoptic segmentation model to detect and segment ingredients in the photo
- Image classification to identify each ingredient
- Large language model to generate recipe text from the identified ingredients
- Flask for the backend API
- Streamlit for the frontend UI
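The glue between those pieces can be sketched roughly as below. This is an illustrative outline, not our actual code: the function names, the `(label, confidence)` detection format, and the 0.5 confidence cutoff are all assumptions made for the example.

```python
CONF_THRESHOLD = 0.5  # assumed cutoff for keeping a classifier detection


def ingredients_from_detections(detections, threshold=CONF_THRESHOLD):
    """Collapse per-segment classifier outputs into a unique ingredient list.

    `detections` is a list of (label, confidence) pairs, one per region
    produced by the segmentation model (hypothetical format).
    """
    seen = []
    for label, conf in detections:
        if conf >= threshold and label not in seen:
            seen.append(label)
    return seen


def build_recipe_prompt(ingredients):
    """Compose the text prompt sent to the language model."""
    items = ", ".join(ingredients)
    return (
        "Suggest a recipe that uses only these ingredients: "
        f"{items}. Give a title, an ingredient list, and numbered steps."
    )


# Example: two confident detections, one duplicate, one false positive.
detections = [("tomato", 0.92), ("egg", 0.81), ("tomato", 0.77), ("fork", 0.30)]
prompt = build_recipe_prompt(ingredients_from_detections(detections))
```

The Flask endpoint then just chains these steps: receive the uploaded photo, run segmentation and classification, build the prompt, and return the LLM's recipe text to the Streamlit frontend.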
Challenges we ran into
- Fine-tuning the segmentation model to work well on food items
- Learning web development skills to build the frontend
- Connecting all the pieces, from image processing to text generation
- Collaborating remotely as a team
Accomplishments that we're proud of
- Achieved 80% accuracy with our image classification model
- Created an end-to-end prototype, from photo to generated recipe
- Learned a ton of new deep learning, web dev, and collaboration skills
- Worked seamlessly as a remote team and successfully combined our efforts
What we learned
- Implementing and deploying deep learning models
- Web development with Flask and Streamlit
- Using large language models for text generation
- Remote teamwork and collaboration
- Solving complex problems by breaking them down and combining the solutions
What's next for Cookipedia
- Continue refining the AI models and architecture
- Build out a full web and mobile app
- Expand the ingredient database and recipe catalogue
- Bring Cookipedia to more mainstream users and reduce food waste!
Built With
- css
- flask
- html5
- javascript
- machine-learning
- openai
- python
- pytorch
- streamlit