About the Project
Inspiration
I wanted to see how far style transfer could go when paired with new animation models. I had used NanoBanana before, but it always struggled with style transfer. The moment I tried the same workflow with Flux Kontext, the results were night and day. That contrast is what pushed me to build this project.
What I Learned
- Style enforcement can make or break an animation workflow
- Flux Kontext holds a style consistently across frames
- Combining style transfer with WAN 2.2 Animate makes it possible to move beyond still images into full sequences
How I Built It
- Picked a Claymation LoRA as the test style
- Applied it to a reference image using Flux Kontext
- Compared against NanoBanana, which failed to keep the style
- Connected the output with WAN 2.2 Animate, using a driving video to guide the motion while keeping the Claymation look
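In code form, the pipeline above amounts to a restyle step followed by a motion-transfer step. The sketch below is purely illustrative: `apply_style` and `animate` are hypothetical placeholder functions standing in for the Flux Kontext and WAN 2.2 Animate stages (the actual project wires these together as a no-code glif.app workflow, not Python):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    index: int
    style: str

def apply_style(reference_image: str, lora: str) -> dict:
    """Steps 1-2: restyle a reference image with a style LoRA
    (e.g. Claymation) via a Flux-Kontext-like editing model.
    Stubbed here for illustration only."""
    return {"image": reference_image, "style": lora}

def animate(styled: dict, driving_video: str, num_frames: int) -> list[Frame]:
    """Step 3: a WAN-2.2-Animate-like stage transfers motion from the
    driving video while carrying the enforced style into every frame."""
    return [Frame(index=i, style=styled["style"]) for i in range(num_frames)]

styled = apply_style("reference.png", lora="claymation")
frames = animate(styled, driving_video="dance.mp4", num_frames=16)

# The property the project depends on: the style survives animation,
# so every output frame should still carry the Claymation look.
assert all(f.style == "claymation" for f in frames)
```

The key design point the sketch captures is that style enforcement happens once, upstream, and the animation stage is only trusted to preserve it, never to re-apply it.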
Challenges
- NanoBanana and Seedream Edit could not enforce the style no matter how I tuned it
- Keeping motion clean without breaking the style took a lot of trial and error
- Running multiple style tests required more compute than expected
Why It Matters
Three years ago, style transfer with motion felt out of reach. Now, Flux Kontext with WAN 2.2 Animate makes it practical. It means we can test endless styles, from Claymation to anime, and actually trust the system to hold the look.
Built With
- glif.app