Inspiration

Food is often wasted when ingredients go bad because we lose track of them in the fridge, or when we overcook them out of inexperience or fear of the health hazards of undercooked food. The United Nations' second Sustainable Development Goal is to end hunger, and by reducing food waste and improving resource allocation, our work contributes to this goal.

What it does

With Pocket Mama, we use Microsoft Azure's machine-learning models to personify the cumulative cooking experience of many years of moms, giving the app two practical abilities: classifying whether or not ingredients are fresh, and offering evaluative feedback on how to perfect the final masterpiece so dishes end up neither undercooked nor overcooked. Beyond these abilities, we revolutionize traditional interactions with recipes (text instructions, images, or even videos) by incorporating customized evaluative feedback based on just a simple photo of your current step, delivered with that warm "Mama" connection.

How we built it

We used Microsoft Azure's Computer Vision and Custom Vision models to process our images. We first used the Computer Vision models to identify the food objects in focus, then piped those results into Custom Vision models to determine how fresh or how well cooked each item was.
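
As a rough illustration, the sketch below shows how such a detect-then-classify pipeline can be wired together with Azure's Python SDKs. The endpoints, keys, project ID, published model name, and tag names are placeholders, and the crop-and-classify flow is an assumption about how the two services fit together rather than our exact production code.

```python
# Minimal sketch of the detect-then-classify flow. Endpoints, keys, IDs, and
# the published model name are placeholders; the crop-and-classify step is an
# assumption about how the two Azure services are chained.
import io

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials, CognitiveServicesCredentials
from PIL import Image

CV_ENDPOINT = "https://<region>.api.cognitive.microsoft.com/"
CV_KEY = "<computer-vision-key>"
CUSTOM_ENDPOINT = "https://<region>.api.cognitive.microsoft.com/"
PREDICTION_KEY = "<custom-vision-prediction-key>"
PROJECT_ID = "<custom-vision-project-id>"
PUBLISHED_MODEL = "<published-iteration-name>"

vision_client = ComputerVisionClient(CV_ENDPOINT, CognitiveServicesCredentials(CV_KEY))
prediction_client = CustomVisionPredictionClient(
    CUSTOM_ENDPOINT, ApiKeyCredentials(in_headers={"Prediction-key": PREDICTION_KEY})
)


def analyze_photo(image_path):
    """Detect food items in a photo, then classify each detected crop."""
    with open(image_path, "rb") as stream:
        detection = vision_client.detect_objects_in_stream(stream)

    photo = Image.open(image_path).convert("RGB")
    results = []
    for obj in detection.objects:
        # Crop the detected region and send it to the Custom Vision classifier.
        r = obj.rectangle
        crop = photo.crop((r.x, r.y, r.x + r.w, r.y + r.h))
        buffer = io.BytesIO()
        crop.save(buffer, format="JPEG")
        prediction = prediction_client.classify_image(
            PROJECT_ID, PUBLISHED_MODEL, buffer.getvalue()
        )
        top = max(prediction.predictions, key=lambda p: p.probability)
        results.append(
            {
                "item": obj.object_property,
                "box": (r.x, r.y, r.w, r.h),
                "label": top.tag_name,
                "confidence": top.probability,
            }
        )
    return results
```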

We then processed the signals and classification labels on a Python Flask server and used PyPlot to draw bounding boxes around the detected food items so that users could visualize the classifications supplied by Azure's models.
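
The sketch below shows how a Flask route of this kind might accept an uploaded photo, run it through the pipeline, and return the image annotated with labelled bounding boxes, assuming PyPlot refers to Matplotlib's pyplot module. The route name, form field, and the analyze_photo() helper carried over from the previous sketch are hypothetical, not our exact implementation.

```python
# Minimal sketch of the Flask endpoint and the box-drawing step. The route name,
# form field, and the analyze_photo() helper from the previous sketch are
# assumptions, not the exact production code.
import tempfile

import matplotlib
matplotlib.use("Agg")  # render off-screen on the server
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle
from flask import Flask, request, send_file
from PIL import Image

app = Flask(__name__)


@app.route("/analyze", methods=["POST"])
def analyze():
    # Save the uploaded photo, classify what is in it, and return the same
    # photo annotated with labelled bounding boxes.
    upload = request.files["photo"]
    src = tempfile.NamedTemporaryFile(suffix=".jpg", delete=False).name
    upload.save(src)

    results = analyze_photo(src)  # detect-then-classify pipeline sketched above

    fig, ax = plt.subplots()
    ax.imshow(Image.open(src))
    for item in results:
        x, y, w, h = item["box"]
        ax.add_patch(Rectangle((x, y), w, h, fill=False, edgecolor="red", linewidth=2))
        ax.text(x, max(y - 5, 0), f'{item["item"]}: {item["label"]}', color="red")
    ax.axis("off")

    annotated = src.replace(".jpg", "_annotated.png")
    fig.savefig(annotated, bbox_inches="tight")
    plt.close(fig)
    return send_file(annotated, mimetype="image/png")
```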

Finally, users interact with a well-designed front-end interface built with Wix and supplemented by custom JavaScript that calls our back-end. The interface is friendly on both desktop and mobile, allowing users to upload images taken from their phones as they cook.

Challenges we ran into

Wix's design interface was aesthetically appealing but much harder to code against than traditional HTML/CSS, and implementing the custom JavaScript calls that sent images to our back-end took a lot of time because of limited documentation.

Accomplishments that we're proud of

We think our idea of interactive recipes with customized evaluative feedback is pretty novel and has a lot of entrepreneurial potential, especially considering its positive externalities for society: reducing food waste and improving nutrition when users cook their food right. It's also a fun way to play with food!

What we learned

Azure's ML models are very versatile and can work well with small amounts of training data, such as 10 labelled images. They are an incredibly powerful tool, especially for making static content interactive.
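
For a sense of how little code a small training run takes, here is a minimal sketch of training a Custom Vision classifier on roughly 10 labelled images per tag. The keys, project name, tag names, and folder layout are placeholders rather than our actual setup.

```python
# Minimal sketch of training a Custom Vision classifier on a small labelled set
# (about 10 images per tag). Keys, project name, tag names, and folder layout
# are placeholders.
import os

from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from azure.cognitiveservices.vision.customvision.training.models import (
    ImageFileCreateBatch,
    ImageFileCreateEntry,
)
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<region>.api.cognitive.microsoft.com/"
TRAINING_KEY = "<custom-vision-training-key>"

trainer = CustomVisionTrainingClient(
    ENDPOINT, ApiKeyCredentials(in_headers={"Training-key": TRAINING_KEY})
)
project = trainer.create_project("freshness-classifier")

entries = []
for tag_name in ("fresh", "spoiled"):  # placeholder tags
    tag = trainer.create_tag(project.id, tag_name)
    folder = os.path.join("training_data", tag_name)  # ~10 labelled images each
    for filename in os.listdir(folder):
        with open(os.path.join(folder, filename), "rb") as f:
            entries.append(
                ImageFileCreateEntry(name=filename, contents=f.read(), tag_ids=[tag.id])
            )

trainer.create_images_from_files(project.id, ImageFileCreateBatch(images=entries))
iteration = trainer.train_project(project.id)
print("Training started, status:", iteration.status)
```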

What's next for Pocket Mama

Pocket Mama starts in the market of helping inexperienced cooks learn recipes through a hands-on, guided approach. However, the idea definitely has the potential to expand to the entire recipe/cooking industry by providing an interactive experience that cannot be substituted by text, photos, or even cooking videos.

There are many different features that we hope to add to Pocket Mama, including:

  • suggesting substitute ingredients
  • sharing tips and tricks from different chefs around the world
  • visually determining the quantity of ingredients needed
  • using augmented reality to give feedback in real time

We hope to make Pocket Mama a one-stop shop for all first-time and inexperienced home cooks, almost as if they had their own mother overseeing them as they cook.
