The Problem
Deciding what food to make is challenging, especially when you’re living alone for the first time. As students, we’ve experienced the difficulty of planning a week of meals. Even once you know what to make, keeping recipes on your phone while cooking can cause a mess. Nobody wants to touch a screen when they are cooking - it’s unsanitary and inconvenient.
The Solution
EyeCook is an augmented reality app that intelligently recommends recipes to users based on a picture of their available ingredients. Snap Spectacles were the perfect platform for this: instructions, timers, and ingredient lists follow you wherever you look, and you never need to touch anything to interact with them. Our system gives step-by-step instructions for each dish, notes potential allergens, and provides a hands-free interface for easy use while cooking.
Our Technology
We built a Snap Spectacles 2024 AR Lens with TypeScript and Lens Studio, an iOS app with UIKit, and a web application with React.js. Each of these front-end clients connects via an encrypted REST API to a multi-node Kubernetes cluster hosted on our Oracle Cloud backend, which is mirrored for full redundancy across US East and US West availability zones. Our server-side code interfaces with two commercially available generative AI endpoints. First, we leverage Google’s Gemini 1.5 Pro model (in JSON structured-output mode) to identify ingredients in an image, build delicious recipes, and detect likely allergens. Next, we use Flux Schnell (via HuggingFace Spaces) to generate a descriptive image for each recipe. Once the server processes the user input, the user can view and edit the detected ingredients and see details on each recipe. In the AR environment, they can step through instructions hands-free and set timers while they cook.
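To give a flavor of the server-side flow, here is a minimal TypeScript sketch of the kind of JSON contract we request from the model in structured-output mode, plus a defensive parser that validates the response before the results reach the AR client. The field names (`ingredients`, `recipes`, `allergens`, `steps`) are illustrative assumptions, not EyeCook’s exact schema:

```typescript
// Illustrative shape of the structured-output JSON we ask the model for.
// Field names here are assumptions for the sketch, not our exact schema.
interface Recipe {
  title: string;
  allergens: string[]; // likely allergens flagged by the model
  steps: string[];     // hands-free, step-by-step instructions
}

interface ScanResult {
  ingredients: string[]; // ingredients identified in the user's photo
  recipes: Recipe[];
}

// Structured-output mode guarantees syntactically valid JSON, but we
// still validate the shape before handing the result to any client.
function parseScanResult(raw: string): ScanResult {
  const data = JSON.parse(raw);
  if (!Array.isArray(data.ingredients) || !Array.isArray(data.recipes)) {
    throw new Error("Malformed model response");
  }
  for (const r of data.recipes) {
    if (typeof r.title !== "string" || !Array.isArray(r.steps)) {
      throw new Error(`Malformed recipe: ${JSON.stringify(r)}`);
    }
    r.allergens = r.allergens ?? []; // allergen list may be absent
  }
  return data as ScanResult;
}
```

Validating on the server keeps all three front-ends simple: each client can trust that any `ScanResult` it receives is well-formed.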
Challenges and Solutions
The 2024 Snap Spectacles are a bleeding-edge product. We were very excited to work with this technology, but we faced a steep learning curve because the product was so new and unique. We really appreciated the Snap team’s willingness to give advice on our technical problems: this assistance was invaluable! We were ultimately successful in developing for the Snap AR platform, and really enjoyed the opportunity to learn this technology. Another challenge came from developing three front-end interfaces. We wanted our technology to be as usable as possible across AR, web, and mobile. To achieve this, we had to coordinate carefully between our developers to ensure that all features were implemented in parallel and remained cross-compatible. Communication was essential throughout the project.
Accessibility
While much of our development focused on the Snap AR application, we recognize that its gesture-based, visual-only interface is not fully accessible. To mitigate this, we took steps to ensure accessibility throughout the rest of our user experience. Our iOS app and web UI both exceed WCAG guidelines for contrast, font size, and spacing. All of our interfaces are color-blind compatible, and both the app and website can be used with a screen reader.
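The contrast checks above follow the standard WCAG 2.1 formula, which can be sketched in a few lines of TypeScript. The function names are ours for illustration; the luminance coefficients and thresholds come directly from the WCAG 2.1 definition (AA requires at least 4.5:1 for normal body text):

```typescript
// Linearize an 8-bit sRGB channel per the WCAG 2.1 relative-luminance definition.
function srgbToLinear(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an sRGB color (WCAG 2.1 coefficients).
function relativeLuminance(r: number, g: number, b: number): number {
  return (
    0.2126 * srgbToLinear(r) +
    0.7152 * srgbToLinear(g) +
    0.0722 * srgbToLinear(b)
  );
}

// Contrast ratio between foreground and background colors.
// WCAG AA: >= 4.5:1 for normal text, >= 3:1 for large text.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}
```

Black text on a white background yields the maximum ratio of 21:1; auditing every palette pair this way made it easy to verify the UI across all three front-ends.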
Accomplishments that we’re proud of
We’re proud to have developed a Snapchat Lens AR interface within 24 hours of being introduced to the technology: it was very challenging at first, but as we put in more work we ultimately surprised ourselves with how much we were able to accomplish. Also, this was our first time building three front-end interfaces: we didn’t think we would have time to complete these, but we were able to get each to a working state.
What we learned
We learned the importance of communicating with each other. We also learned that it’s worth it to challenge yourself and set high expectations. None of us had used Lens Studio before, and we really enjoyed learning how to use this AR platform. Additionally, Asa had not worked in web design before, and had to learn how to put together a web frontend for the server.
Operationalization
In the future, we would love to deepen our integration with the entire kitchen process. This could include walking the user through complicated steps, providing tips and recommendations based on technique, and providing different options to customize recipes based on personal preference. In the long-term, we see this technology having applications to meal tracking and planning. For example, the system could recommend specific items for a grocery list, track past food purchases, and recommend how to combine leftovers to make delicious new meals.
Open-Source Technology Used
Axios, Node.js, Flux Schnell, Express, Sharp (image conversion), MySQL, Apple Developer Kit
Built With
- css3
- devkit
- flux
- gemini
- html5
- javascript
- kubernetes
- lens-studio
- node.js
- oracle
- spectacles
- typescript