Inspiration
We are so overloaded with information that most of our time goes into finding a source relevant to our context. The information, despite being available, is still distant because it needs sorting through.
Imagine a video you could customize: one that skips the boring parts when you're in a hurry, but tells a story and keeps you company while you're patiently waiting for the bake to finish.
Oui-Chef aims to bring cooking instructions closer to the viewer by generating a complete video from a combination of high-quality recipes, while taking into account the user's skill, available ingredients, attention span, and more.
What it does
You tell it what you want to cook, describe what you have on hand and how you would like to watch, and it tailors a video to you!
How we built it
We used Aixplain together with HeyGen to create a custom video-generation pipeline.
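The write-up doesn't detail how the tailoring works, but the idea of adapting a recipe to user skill and time budget can be sketched in pure Python. Everything below is hypothetical: the `UserProfile` fields, the step format, and `build_script` are our illustration, not the actual Aixplain/HeyGen pipeline (those API calls are omitted entirely).

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    skill: str              # hypothetical: "beginner" | "intermediate" | "expert"
    minutes_available: int  # rough attention budget for the video

SKILL_RANK = {"beginner": 0, "intermediate": 1, "expert": 2}

# Hypothetical recipe steps: (instruction, narration seconds, difficulty)
RECIPE = [
    ("Dice the onions", 40, "beginner"),
    ("Saute the onions until translucent", 60, "beginner"),
    ("Deglaze the pan with white wine", 50, "intermediate"),
    ("Reduce the sauce and season to taste", 70, "intermediate"),
]

def build_script(recipe, profile):
    """Keep only steps the viewer still needs, trimmed to the time budget."""
    viewer = SKILL_RANK[profile.skill]
    # Skip steps the viewer has already mastered (difficulty below their skill).
    needed = [(text, secs) for text, secs, level in recipe
              if SKILL_RANK[level] >= viewer]
    budget = profile.minutes_available * 60
    script, used = [], 0
    for text, secs in needed:
        if used + secs > budget:
            break  # attention budget exhausted: stop adding steps
        script.append(text)
        used += secs
    return script
```

A script produced this way would then be handed to the video generator; a patient beginner gets every step narrated, while an intermediate cook in a hurry gets only the non-trivial ones.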
Challenges we ran into
Generating custom video, and stitching the resulting clips into one coherent video.
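The submission doesn't say how the stitching was solved, but one common approach is FFmpeg's concat demuxer, which joins clips without re-encoding. The sketch below only builds the file-list manifest and the command line (it doesn't run FFmpeg); the file names are hypothetical.

```python
def concat_manifest(clip_paths):
    """Build the file-list text for FFmpeg's concat demuxer, in playback order."""
    # The demuxer expects one "file '<path>'" line per clip; single quotes
    # guard paths containing spaces.
    return "\n".join(f"file '{p}'" for p in clip_paths) + "\n"

def concat_command(list_file, output):
    """FFmpeg invocation that stitches the listed clips by stream copy."""
    # "-c copy" avoids re-encoding but requires all clips to share the same
    # codec, resolution, and timebase; "-safe 0" permits arbitrary paths.
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list_file, "-c", "copy", output]
```

In practice you would write the manifest to disk and run the command with `subprocess.run`; if the generated clips differ in encoding parameters, they have to be re-encoded instead of stream-copied.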
Accomplishments that we're proud of
Developing something that works within 3 hours!
What we learned
Agility, and to ask for help when we need it.
What's next for Oui-Chef
Lots! The prototype is still far from the vision, so we have a long way to go:
- Addition of attention span as a parameter.
- Actually generate sequences of cooking actions (like sautéing onions in a pan) and stitch them together while keeping the context consistent.
- Use multimodal responses to dynamically change the instructions mid-recipe (to account for you burning the onions while sautéing!).
Built With
- aixplain
- heygen
- python