Inspiration
I have 100+ saved cooking videos on Instagram. I've cooked maybe 5 of them.
The problem isn't motivation, it's friction. The recipe is buried inside a video. You have to rewatch it, pause every 3 seconds, squint at ingredients, and somehow follow along while your hands are covered in olive oil. It's a terrible experience, and every home cook knows it.
I wanted to build the thing I kept wishing existed: share a video, get a real recipe, and have something guide me through it without touching my phone.
What it does
ChopChop does two things really well:
1. Turns cooking videos into actual recipes. Share a TikTok, Instagram Reel, or YouTube video, and AI watches it and pulls out ingredients, steps, cook times, difficulty: all of it. It works with photos too (snap a picture of a meal and it creates a recipe).
2. Cooks with you. Hit "Start Cooking" and talk to it. The AI knows your recipe, your skill level, and your dietary restrictions. Completely hands-free, so no more greasy fingerprints on your screen. You get a conversation with a chef that lives in your pocket, and it works in any language (just tell it which one you prefer).
On top of that, there are collections to organize your recipes and auto-generated grocery lists.
How I built it
The iOS app is SwiftUI + SwiftData. I designed a warm, earthy UI with a dedicated dark-mode cooking interface (because nobody wants a bright white screen blasting them at 9pm while they're making pasta).
When you share a video URL, it gets downloaded, then Gemini 2.5 Flash extracts the structured recipe data. For voice cooking, the backend hands off an ephemeral token and the app connects directly to OpenAI's Realtime API over WebSocket, with voice activity detection, echo cancellation, and tool calling so the AI can actually help you out in the kitchen.
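The extraction step comes down to asking the model for JSON that matches a fixed recipe schema and validating it before anything reaches the app. A minimal sketch of that validation, assuming hypothetical field names (the actual schema isn't shown in the post):

```python
import json
from dataclasses import dataclass


@dataclass
class Ingredient:
    name: str
    quantity: str  # kept as a string: videos rarely give clean numbers


@dataclass
class Recipe:
    title: str
    difficulty: str
    cook_time_minutes: int
    ingredients: list  # list[Ingredient]
    steps: list        # list[str]


def parse_recipe(raw: str) -> Recipe:
    """Validate the model's JSON output against the recipe schema.

    Raises KeyError/ValueError on malformed output, so a bad
    extraction can be retried instead of saved.
    """
    data = json.loads(raw)
    return Recipe(
        title=data["title"],
        difficulty=data["difficulty"],
        cook_time_minutes=int(data["cook_time_minutes"]),
        ingredients=[Ingredient(**i) for i in data["ingredients"]],
        steps=list(data["steps"]),
    )
```

Failing loudly here matters: it is what turns "the model sometimes invents a field" into a retry instead of a corrupted recipe in the user's collection.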
Subscriptions are powered by RevenueCat: three free voice cooking sessions a month, with Pro for unlimited. The paywall uses RevenueCatUI so I can tweak it from the dashboard without shipping an update.
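The free tier works out to a simple monthly counter check. A sketch of that gating logic with hypothetical names (in the real app the Pro check goes through RevenueCat entitlements):

```python
FREE_SESSIONS_PER_MONTH = 3  # free-tier allowance from the pricing above


def can_start_voice_session(is_pro: bool, sessions_this_month: int) -> bool:
    """Pro users are unlimited; free users get a monthly allowance."""
    if is_pro:
        return True
    return sessions_this_month < FREE_SESSIONS_PER_MONTH
```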
Challenges I ran into
Every cooking video is different. There's no standard format. One creator lists ingredients in the caption, another rattles them off in 2 seconds, another never mentions quantities at all. Getting AI to reliably extract structured recipes across all these styles took a lot of prompt iteration and edge-case handling.
Kitchens are noisy. Sizzling pans, exhaust fans, running water: early versions kept thinking the stove was talking. Dialing in the voice activity detection threshold and adding echo cancellation made a huge difference.
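Tuning that threshold happens in the Realtime API's session config: server-side voice activity detection exposes a sensitivity threshold and a silence window. A sketch of the `session.update` event with illustrative values (the thresholds that actually worked in a kitchen aren't given in the post):

```python
import json


def vad_session_update(threshold: float = 0.7, silence_ms: int = 600) -> str:
    """Build a session.update event that raises the VAD threshold so
    sizzling pans and exhaust fans don't register as speech."""
    event = {
        "type": "session.update",
        "session": {
            "turn_detection": {
                "type": "server_vad",        # server-side voice activity detection
                "threshold": threshold,      # higher = less sensitive to background noise
                "prefix_padding_ms": 300,    # audio kept from before speech was detected
                "silence_duration_ms": silence_ms,  # pause length that ends a turn
            }
        },
    }
    return json.dumps(event)
```

A longer `silence_duration_ms` also matters for cooking: people pause mid-sentence while they stir, and you don't want the model jumping in every time.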
What I learned
That the hardest part of building a voice interface isn't the technology, it's the timing. When to listen, when to respond, how long to wait. It's weirdly human-design stuff for something so technical.
Also most of my time on ChopChop wasn't spent writing code. It was spent cooking with it. Making dinner, getting annoyed at something, going back to fix it, then cooking again. The app got good when I stopped thinking like a developer and started thinking like someone just trying to make a meal.
That's also when I realized: the best apps don't add something to your life. They remove something. ChopChop doesn't give you more recipes. You already have a hundred saved. It just removes the wall between saving one and actually cooking it.
What's next
- Automatically create recipes in the background when the user saves a video on Instagram/TikTok
- Meal planning with calendar integration
- Smarter grocery lists that merge ingredients across recipes
- Social features: share recipes with friends, share results to social media