Inspiration and The Problem: I built FoodLens and all of its systems from scratch over the last few weeks because too many Mixed Reality apps add to your space without actually using it, and without a real use case. I have attended two Meta hackathons before, in San Francisco and New York, without winning (though I finished as a runner-up), and those experiences inspired me to finally tackle this complex idea now that the Quest 3 hardware is ready and the AR market is evolving (Ray-Ban Displays). Cooking felt like the perfect challenge: everyone does it, and the app requires a hands-free interface and real-time spatial data, which phones and tablets cannot provide effectively.

What The App Does: FoodLens is a bridge between advanced AI and your physical kitchen. It does not just display text; it sees what you are doing. You can look at a pan of frying chicken and simply ask the AI if it looks cooked enough, and the system analyzes the visual texture of the crust to give you a second opinion. You can also scan every ingredient in your pantry or fridge in under thirty seconds with high accuracy, then ask what you can make with the items you have on hand.

How I Built It: I chose Unreal Engine 5.5 for its superior rendering capabilities, although the existing engine tools could not access the raw camera feed. To solve this I engineered a custom high-performance capture architecture from the ground up: a proprietary C++ plugin that interfaces directly with the Android Vulkan layer, bypassing the standard media frameworks so the app can grab hardware camera frames for analysis immediately and without frame drops.
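One common pattern for this kind of camera-thread-to-game-thread handoff is a lock-free triple buffer, where the producer always has a free slot to write into and the consumer always reads the most recently published frame. The sketch below is my own minimal illustration of that pattern, not the plugin's actual code; the `Frame` struct is a placeholder, and the real Vulkan/AHardwareBuffer plumbing is omitted.

```cpp
#include <atomic>
#include <cstdint>

struct Frame { uint64_t id; /* pixel data omitted in this sketch */ };

// Single-producer / single-consumer "latest frame" mailbox.
// The camera thread never blocks and never waits on the game thread;
// the game thread always sees the newest published frame.
class TripleBuffer {
public:
    // Camera thread: publish a freshly captured frame.
    void publish(const Frame& f) {
        buffers_[writeIdx_] = f;
        // Swap the written slot into the shared "middle" slot, marked fresh.
        writeIdx_ = middle_.exchange(writeIdx_ | kFreshBit,
                                     std::memory_order_acq_rel) & kIdxMask;
    }
    // Game thread: returns true only if a new frame arrived since last read.
    bool consume(Frame& out) {
        if (!(middle_.load(std::memory_order_acquire) & kFreshBit)) return false;
        readIdx_ = middle_.exchange(readIdx_, std::memory_order_acq_rel) & kIdxMask;
        out = buffers_[readIdx_];
        return true;
    }
private:
    static constexpr int kFreshBit = 4, kIdxMask = 3;
    Frame buffers_[3]{};
    int writeIdx_ = 0;
    int readIdx_ = 1;
    std::atomic<int> middle_{2};
};
```

If the camera outpaces the renderer, stale frames are silently overwritten rather than queued, which keeps latency bounded.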

For the brain of the app, I architected a custom backend that manages multi-modal context. I developed a computer vision algorithm that uses multi-view triangulation, driven by headset position and rotation telemetry, to spatially track individual items in 3D. This system builds a persistent spatial map of the kitchen, ensuring it never duplicate-scans the same object and only updates an item's data when a superior viewing angle is captured. To ensure stability, I engineered atomic synchronization that aligns the high-speed Camera Hardware Thread with the Game Thread, eliminating the race conditions common in high-performance MR apps.
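The triangulation step can be sketched with standard two-ray geometry: given two headset poses and the view rays toward the same item, the item's position is estimated as the midpoint of the shortest segment between the rays. This is a textbook formulation of multi-view triangulation under my own assumptions, not FoodLens's exact math; `Vec3` and `triangulate` are illustrative names.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(double s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Estimate an item's 3D position from two sightings: headset positions
// p1, p2 and view-ray directions d1, d2 toward the item. Returns the
// midpoint of the closest-approach segment between the two rays.
Vec3 triangulate(Vec3 p1, Vec3 d1, Vec3 p2, Vec3 d2) {
    Vec3 w0 = p1 - p2;
    double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
    double d = dot(d1, w0), e = dot(d2, w0);
    double denom = a * c - b * b;            // ~0 when the rays are near-parallel
    if (std::fabs(denom) < 1e-9) return p1;  // degenerate: fall back to first pose
    double s = (b * e - c * d) / denom;
    double t = (a * e - b * d) / denom;
    Vec3 q1 = p1 + s * d1;                   // closest point on ray 1
    Vec3 q2 = p2 + t * d2;                   // closest point on ray 2
    return 0.5 * (q1 + q2);
}
```

Using the midpoint rather than an exact intersection makes the estimate robust to the small pose and gaze errors inherent in headset telemetry.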
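The "update only on a superior viewing angle" rule could look something like the following: score each sighting of a tracked item (e.g. by how head-on the view was) and replace the stored scan only when the new score beats the old one. This is my own hedged guess at the shape of such a rule; the scoring heuristic, `Sighting`, and `SpatialItemMap` are all illustrative, not the app's actual types.

```cpp
#include <string>
#include <unordered_map>

struct Sighting {
    std::string label;   // e.g. classifier output for this view
    double viewScore;    // 1.0 = item seen dead-on, lower = oblique/partial view
};

// Persistent map of tracked items, keyed by a stable item ID, that keeps
// only the best-angle scan seen so far (so nothing is duplicate-scanned).
class SpatialItemMap {
public:
    // Returns true if this sighting replaced (or created) the stored record,
    // i.e. it came from a superior viewing angle.
    bool observe(int itemId, const Sighting& s) {
        auto it = items_.find(itemId);
        if (it != items_.end() && it->second.viewScore >= s.viewScore)
            return false;  // the existing scan already has a better angle
        items_[itemId] = s;
        return true;
    }
    const Sighting* get(int itemId) const {
        auto it = items_.find(itemId);
        return it == items_.end() ? nullptr : &it->second;
    }
private:
    std::unordered_map<int, Sighting> items_;
};
```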

Future Plans: FoodLens is targeting a full release soon. I would like to build a Pantry Gamification feature that uses the ingredient scan system to generate unique zero-waste recipes from your niche ingredients. I also plan to launch an AI-Curated Cookbook, a community platform where the AI learns from your feedback to model your specific taste preferences and tailor suggestions. I would also like to release a version of this on the Meta Ray-Ban Displays. Finally, I plan to release my custom Vulkan camera architecture as a plugin to help other Unreal developers.

Cheers!
