About the Project

Inspiration

I wanted to see how far style transfer could go when paired with new animation models. I had used NanoBanana before, but it always struggled with style transfer. The moment I tried the same workflow with Flux Kontext, the results were night and day. That contrast is what pushed me to build this project.

What I Learned

  • Style enforcement can make or break an animation workflow
  • Flux Kontext holds a style consistently across frames
  • Combining style transfer with WAN 2.2 Animate makes it possible to move beyond still images into full sequences

How I Built It

  1. Picked a Claymation LoRA as the test style
  2. Applied it to a reference image using Flux Kontext
  3. Compared the result against NanoBanana, which failed to keep the style
  4. Connected the output with WAN 2.2 Animate, using a driving video to guide the motion while keeping the Claymation look

Challenges

  • NanoBanana and Seedream Edit could not enforce the style no matter how I tuned it
  • Keeping motion clean without breaking the style took a lot of trial and error
  • Running multiple style tests required more compute than expected

Why It Matters

Three years ago, style transfer with motion felt out of reach. Now, Flux Kontext with WAN 2.2 Animate makes it practical. It means we can test endless styles, from Claymation to anime, and actually trust the system to hold the look.

Built With

  • glif.app