Inspiration
The inspiration for Munch comes from our challenging experience living in a cramped triple dorm last year. With limited space, our own mini-fridge quickly became a source of frustration, often overflowing with forgotten leftovers — that was problem number one. Our floor was shared by over 50 people, who kept their groceries in the common floor fridge and often forgot about them. By the end of the year it had all piled up: the shared fridge was so overstuffed that it broke down, leading to a $75 repair charge we all had to bear — problem number two.
These two eye-opening experiences revealed the common issues of food waste and mismanagement in shared living spaces. Determined to find a solution, we envisioned an app that streamlines meal planning and helps users manage their groceries efficiently. Munch aims to create a supportive community around food management, helping students reclaim their kitchen space and foster a sustainable approach to consumption.
What it does
Leftover groceries in your dorm and not sure what to make? Munch is a recipe app that lets users search for recipes based on the ingredients they have on hand. Just point your phone camera at your groceries — Munch will automatically detect the ingredients and suggest recipes sorted by preparation time. Find something interesting? Munch will give you hands-free cooking instructions every step of the way. Munch is the perfect solution for college students looking to make a quick and easy meal with the ingredients they already have.
How we built it
Using LangChain together with Google's generative AI models (Gemini 1.5 Flash-002 and 1.5 Turbo), we feed in the photo and get back a list of every ingredient in sight. A carefully engineered prompt then has Gemini generate a variety of recipes using those ingredients, categorized into Breakfast, Lunch, Dinner and Snacks, with each recipe listed alongside its preparation time. The user can then select which recipe to make.
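The grouping-and-sorting step after the model call can be sketched in plain Python. This is a minimal illustration, not Munch's actual code: we assume the prompt asks Gemini for JSON shaped like `SAMPLE_RESPONSE` below (the schema and field names are our own invention).

```python
import json

# Assumed response shape — the real schema depends on the prompt sent to Gemini.
SAMPLE_RESPONSE = """
{
  "recipes": [
    {"name": "Veggie Omelette", "category": "Breakfast", "prep_minutes": 10},
    {"name": "Fried Rice",      "category": "Dinner",    "prep_minutes": 25},
    {"name": "Avocado Toast",   "category": "Breakfast", "prep_minutes": 5}
  ]
}
"""

def group_and_sort(raw_json: str) -> dict:
    """Group recipes by meal category, sorting each group by prep time."""
    recipes = json.loads(raw_json)["recipes"]
    grouped: dict[str, list[dict]] = {}
    for recipe in recipes:
        grouped.setdefault(recipe["category"], []).append(recipe)
    for category in grouped:
        grouped[category].sort(key=lambda r: r["prep_minutes"])
    return grouped

menu = group_and_sort(SAMPLE_RESPONSE)
print([r["name"] for r in menu["Breakfast"]])  # ['Avocado Toast', 'Veggie Omelette']
```

Sorting inside each category keeps the "quickest recipe first" ordering that the app surfaces to the user.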
For the step-by-step guidance, we used Hume AI's Empathic Voice Interface (EVI) API with a custom system prompt that keeps each step concise while still forming complete, coherent sentences that are easy for the user to follow. Through EVI, the user can ask to restart the recipe, skip ahead or go back a step, get enhanced detail on any particular step, or anything else one might think of while cooking with Gordon Ramsay (minus the "idiot sandwich"). And if the user can't use audio in a given situation, they aren't limited to text-to-speech: the same instructions they would hear from the voice model are shown on screen.
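The navigation commands the voice interface supports ("restart", "back", "skip") boil down to a small amount of state. As a hypothetical sketch — the class name and methods are ours, and the real app wires these actions to EVI intents rather than direct calls:

```python
class StepNavigator:
    """Tracks the current position in a recipe's step list."""

    def __init__(self, steps: list[str]):
        self.steps = steps
        self.index = 0

    def current(self) -> str:
        return self.steps[self.index]

    def skip(self) -> str:
        """Advance one step, stopping at the final step."""
        self.index = min(self.index + 1, len(self.steps) - 1)
        return self.current()

    def back(self) -> str:
        """Go back one step, stopping at the first step."""
        self.index = max(self.index - 1, 0)
        return self.current()

    def restart(self) -> str:
        self.index = 0
        return self.current()

nav = StepNavigator(["Boil water", "Add pasta", "Drain and serve"])
nav.skip()
print(nav.current())  # Add pasta
```

Clamping at both ends means a stray "skip" on the last step or "back" on the first never crashes mid-recipe.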
If the user would rather not use one of the provided recipes, they can paste the URL of a recipe they found online, and Munch will process it using Perplexity AI's fast, integrated "answer engine". The engine scans the website and generates the name, description, ingredients and steps, then enables the same hands-free access to that recipe!
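A hedged sketch of that URL-extraction step: Perplexity exposes an OpenAI-style chat-completions endpoint, but the model name, prompt wording, and JSON schema below are our assumptions, not Munch's exact implementation.

```python
import json
import urllib.request

API_URL = "https://api.perplexity.ai/chat/completions"

def build_payload(recipe_url: str) -> dict:
    """Ask the answer engine to read the page and reply with structured JSON."""
    prompt = (
        f"Read the recipe at {recipe_url} and reply with JSON containing "
        '"name", "description", "ingredients" (a list) and "steps" (a list).'
    )
    return {
        "model": "sonar",  # assumed model name
        "messages": [{"role": "user", "content": prompt}],
    }

def extract_recipe(recipe_url: str, api_key: str) -> dict:
    """Send the request and parse the recipe JSON out of the reply."""
    request = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(recipe_url)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    return json.loads(body["choices"][0]["message"]["content"])
```

Once `extract_recipe` returns, the resulting steps can be handed to the same hands-free guidance flow as any built-in recipe.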
We came up with a quick, minimalistic and elegant design for the app with the help of Figma. We used Flask as a mediator to tie all the services together and Flutter to lay the foundation for Munch.
Challenges we ran into
Our front-end development hit several roadblocks, primarily with the Hume AI integration into Flutter. The unreleased Flutter SDK for Hume posed a significant hurdle: a lack of documentation left us navigating uncharted territory, making it extremely time-consuming to implement Hume's emotional analysis features.
We also faced dependency issues with Android packages, which caused multiple compatibility problems. These conflicts meant we had to troubleshoot package versions and configurations, which ate into our development time. Additionally, we struggled with getting a reliable camera feed on Android emulators. This was crucial since our app relies on camera input for scanning groceries, but the inconsistencies in the feed, particularly on Android, disrupted the flow of our testing and user experience.
For the hands-free step-by-step guidance, we initially used Deepgram for speech-to-text (STT) and LMNT for text-to-speech (TTS), but integrating them became too complex and time-consuming. Coordinating voice input and output without delays or errors added significant overhead and limited our app's flexibility, and managing dependencies for both STT and TTS created performance challenges. This led us to search for a more efficient solution.
Despite all these hurdles, we had a fun, enjoyable, and memorable journey.
Accomplishments that we're proud of
Despite all the problems we faced, the team developed a fully functional mobile app with cross-platform support in under 48 hours — a testament to our resilience and will to surmount problems no matter what. With just the two of us working, we showed how passionate we were about the project, working through the night and keeping each other company. Seeing it come together through each stage of development fueled our motivation to keep going until the end.
What we learned
Throughout the development of Munch, we gained valuable insights into both technical and user experience challenges. We learned the importance of effective integration strategies when dealing with third-party APIs, especially with the unreleased Hume SDK and its limited documentation. This taught us to prioritize clear documentation and community support when choosing tools for future projects.
We also discovered how crucial user feedback is in shaping features and functionality. Iterative testing allowed us to refine our approach, ensuring that Munch truly meets the needs of college students. The journey highlighted the importance of adaptability—pivoting from using Deepgram and LMNT for voice guidance to a more streamlined approach with Hume's EVI, ultimately enhancing our app’s usability.
Finally, we recognized that collaboration and communication within our team were vital for overcoming obstacles. Our shared passion for the project helped us stay motivated and inspired, reinforcing the belief that resilience and teamwork can lead to meaningful solutions.
What's next for Munch
We have a lot of ideas that we wish to formulate and bring to fruition with Munch, starting with:
1) Support for YouTube video recipes: While the initial feature of supporting internet links to recipes was essential for broadening the scope of recipe suggestions, we believe that many people, especially in the digital age, turn to YouTube for cooking tutorials. With Munch, we want to take this convenience a step further by integrating YouTube video support directly into the app.
2) Integrated log-in system: Right now, Munch runs without requiring the user to log in, which reduces the click-off rate. But some users will want to store their information long-term, and some will be returning users. To ensure their data remains intact, a log-in system will retain every recipe they have ever added to our database.
3) Grocery management system: True to our motto, saving you time is a high priority. Misplacing and then overbuying groceries is a big problem, and we want to help. Once a log-in system is in place, we will be able to track how many groceries you have, ask for weekly updates, and, when you're running low, suggest a shopping trip to the nearest wholesale market.
4) Expanding beyond basic college recipes: As much as we value your time and provide the quickest recipes, we also want to cater to people whose hobby and passion is cooking. We plan to add more databases with professional recipes from chefs across the globe, letting you turn your home into a Michelin-star restaurant, one delectable meal at a time.