Inspiration
Every home cook knows the struggle: you find a great recipe in a cookbook, on a friend's fridge, or in a magazine, and now you're juggling your phone, a messy counter, and flour-covered hands trying to follow along. PocketShelf started because I wanted one app that could take any recipe from anywhere and turn it into something I could actually cook with.
What it does
Point your camera at any recipe. PocketShelf uses AI to extract the full recipe (title, ingredients, steps, timers) and organizes it on your shelf. When you're ready to cook, it walks you through each step with voice guidance so you never have to touch your phone. Smart countdown timers kick in automatically when a step involves waiting. Need to adjust servings? Slide the scaler and all ingredients update.
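The serving scaler is the simplest piece to sketch. A minimal version, with hypothetical `Ingredient` and `Recipe` types (not PocketShelf's actual model), stores each quantity against the recipe's base serving count and scales by pure multiplication, so a SwiftUI view can rebind live as the slider moves:

```swift
import Foundation

// Hypothetical sketch of the serving scaler: quantities are stored for the
// recipe's base serving count, and scaling is a pure function of the
// requested servings.
struct Ingredient {
    let name: String
    let quantity: Double   // amount at the base serving count
    let unit: String
}

struct Recipe {
    let baseServings: Int
    let ingredients: [Ingredient]

    /// Returns the ingredient list scaled to the requested serving count.
    func scaled(to servings: Int) -> [Ingredient] {
        let factor = Double(servings) / Double(baseServings)
        return ingredients.map {
            Ingredient(name: $0.name, quantity: $0.quantity * factor, unit: $0.unit)
        }
    }
}
```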
How we built it
Built natively in Swift and SwiftUI. Apple's Vision framework handles text recognition from photos, then an AI processing layer parses the unstructured recipe text into structured data. AVSpeechSynthesizer powers the voice guidance in cooking mode, and the RevenueCat SDK handles subscriptions and the paywall. The whole thing came together in about two weeks of focused building.
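The OCR step looks roughly like this, a minimal sketch assuming a `UIImage` from the camera (function name and plumbing are illustrative, not the app's actual code). `VNRecognizeTextRequest` returns per-line observations whose top candidates get joined into raw text for the downstream parsing layer:

```swift
import Vision
import UIKit

// Sketch of the Vision OCR step: recognize text in a captured photo and
// hand the raw lines to the AI parsing layer.
func recognizeRecipeText(in image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate per detected line; layout reconstruction
        // (columns, sidebars) happens later in the parsing layer.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed for recipes

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```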
Challenges we ran into
Recipe layouts are wild: two-column cookbooks, sidebars with tips mixed into instructions, ingredients split across pages. Getting the AI to reliably parse all of that into clean structured data took far more iteration than expected. Tuning AVSpeechSynthesizer so it doesn't sound robotic when reading cooking instructions was its own challenge (getting "sauté" pronounced right was a fun one).
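For the "sauté" problem, AVFoundation lets you pin a word to an IPA pronunciation through an attributed string. This is a sketch of that technique, not the app's exact tuning; the IPA value and rate adjustment here are illustrative:

```swift
import AVFoundation

// Sketch of speech tuning: slow the rate slightly for spoken cooking steps
// and pin tricky words like "sauté" to an explicit IPA pronunciation.
func speakStep(_ text: String, with synthesizer: AVSpeechSynthesizer) {
    let attributed = NSMutableAttributedString(string: text)
    if let range = text.range(of: "sauté") {
        attributed.addAttribute(
            NSAttributedString.Key(AVSpeechSynthesisIPANotationAttribute),
            value: "soʊ.ˈteɪ",  // approximate IPA, for illustration
            range: NSRange(range, in: text)
        )
    }
    let utterance = AVSpeechUtterance(attributedString: attributed)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate * 0.9  // a touch slower for cooking
    utterance.postUtteranceDelay = 0.3                         // brief pause between steps
    synthesizer.speak(utterance)
}
```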
What we learned
Building for "hands are dirty" UX is a completely different design constraint from normal app design. Also, RevenueCat's SDK is genuinely well designed: we went from zero to a working paywall in an afternoon.
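For context, the RevenueCat setup is roughly this shape; the API key and entitlement identifier below are placeholders, not PocketShelf's real values:

```swift
import RevenueCat

// Placeholder key; RevenueCat issues a real one per app.
func configurePurchases() {
    Purchases.configure(withAPIKey: "appl_XXXXXXXX")
}

// Fetch the current offering and purchase its first package, then check
// the entitlement to unlock premium features.
func presentPaywall() {
    Purchases.shared.getOfferings { offerings, _ in
        guard let package = offerings?.current?.availablePackages.first else { return }
        Purchases.shared.purchase(package: package) { _, customerInfo, _, userCancelled in
            if !userCancelled,
               customerInfo?.entitlements["pro"]?.isActive == true {  // "pro" is a placeholder name
                // Unlock premium features here.
            }
        }
    }
}
```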
What's next
Instacart integration so you can go from photo to grocery delivery. Support for more recipe sources like URLs and PDFs. Social sharing so you can send digitized recipes to friends.