Inspiration

Behind the Scenes – The UnderDogs started long before I opened any AI tool — it began with a single picture of my doggies.

I found an old photo of myself and cast my child self, a kid in a fur-lined jacket the same age as Niko in the story. Something about that picture held a kind of stubborn, hopeful energy — the version of me who believed he could do anything.

Around the same time, I took a photo of the three Pomskies I was dogsitting: Odin (the big one with the brown left eye), Sofi (the darker one with all-blue eyes), and Lotus (tiny, white, and fearless). They looked like a miniature, chaotic sled team — barely coordinated, but bursting with personality.

That collision — childhood courage + three ridiculous Pomskies — lit the fuse.

I’ve always loved sled-dog stories. Movies about the Alaskan frontier were a core part of my childhood. They always had the same themes: grit, loyalty, nature at its most dangerous and beautiful. So the idea formed:

What if a tiny Pomsky team took on a legendary 500-mile Alaskan race? What if no one believed in them? And what if they didn’t care?

The more I thought about it, the more personal it became. This BTS project is the story of how that idea turned into a film.

What it does

The BTS version documents:

How the world of The UnderDogs was created shot by shot

How the AI tools were stitched together into a single cinematic style

How I kept the designs of three dogs consistent across dozens of scenes

How I built the avalanche sequence, which became the technical heart of the project

How Niko, the boy, became a believable emotional anchor rather than a “generated face”

This project shows the making-of process — not just the film, but the experimentation, the failures, the discoveries, the improvisation, and the behind-the-curtain mechanics of building a cohesive short film using multiple AI tools.

It reveals the “real filmmaking” parts: blocking, continuity, color control, shot language, sound layering, pacing, and character-driven storytelling.

How we built it

This was built like a real production pipeline — just with AI instead of cameras.

  1. Script → Shotlist → JSON Sequences

I wrote the story first, then converted each scene into a JSON-based shot plan so I could control:

focal lengths

dolly vs. crane vs. tracking motion

environmental rules (fog, snow, light angle)

character continuity

emotional beats

pacing and timing

This structure acted like a storyboard and director's notebook fused together. However, the more deterministic I tried to be, the more I realized it was better to improvise: to work probabilistically with what the tools gave me instead of forcing the exact shots I wanted.
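A shot plan along these lines can be sketched in a few lines of Python. The field names and the `validate` helper below are my own illustration of the idea, not the exact schema used in production:

```python
import json

# Hypothetical shot entry — field names are illustrative, not the real schema.
shot = {
    "id": "sc04_sh12",
    "scene": "ridge_descent",
    "camera": {"motion": "tracking", "focal_length_mm": 35},
    "environment": {"fog": 0.6, "snow": "light", "light_angle_deg": 20},
    "continuity": {"characters": ["Niko", "Odin", "Sofi", "Lotus"]},
    "beat": "Niko spots the cornice cracking",
    "duration_s": 4.5,
}

def validate(shot):
    """Cheap guardrail: catch missing keys before a prompt ever goes out."""
    required = {"id", "camera", "environment", "continuity", "beat"}
    missing = required - shot.keys()
    if missing:
        raise ValueError(f"shot {shot.get('id')} missing: {sorted(missing)}")
    return shot

print(json.dumps(validate(shot), indent=2))
```

Keeping every shot in one machine-checkable format is what lets continuity rules (light angle, character roster) be enforced mechanically rather than by memory.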

  2. Character & Dog Consistency

The dogs were the most unstable part of the entire pipeline. AI loves to “improve” characters, especially animals.

So I locked down:

Odin’s heterochromia (right eye blue, left eye brown)

Sofi’s darker multitone fur + blue eyes

Lotus’s white coat + smaller frame (the AI constantly tried to turn him into a wolf)

harness colors

their general proportions next to the sled

This required reference images pinned into every generation.
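In practice, "locking down" a character means the same descriptors and reference images ride along with every single generation request. A minimal sketch of that idea — the function, request schema, and file paths are illustrative assumptions, not the actual pipeline code:

```python
# Locked character descriptors, appended verbatim to every prompt so the
# models can't "improve" the dogs. Traits come from the reference notes above.
CHARACTER_LOCKS = {
    "Odin": "large Pomsky, heterochromia: right eye blue, left eye brown, consistent harness color",
    "Sofi": "darker multitone Pomsky fur, both eyes blue, consistent harness color",
    "Lotus": "small all-white Pomsky (NOT a wolf), tiny frame next to the sled",
}

# The drift we kept fighting, expressed as a negative prompt.
NEGATIVES = "wolf, fox, Alaskan malamute, fantasy creature"

def build_request(action, characters, reference_images):
    """Compose a generation request with character locks and pinned references.
    `reference_images` is a list of file paths sent alongside the prompt."""
    locks = "; ".join(CHARACTER_LOCKS[c] for c in characters)
    return {
        "prompt": f"{action}. Characters: {locks}.",
        "negative_prompt": NEGATIVES,
        "reference_images": list(reference_images),
    }

req = build_request(
    "The sled team crests a snow ridge at golden hour",
    ["Odin", "Sofi", "Lotus"],
    ["refs/odin_front.png", "refs/sofi_side.png", "refs/lotus_run.png"],
)
```

The point is that consistency lives in one dictionary, so a fix to Odin's eye colors propagates to every future shot automatically.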

  3. Environments & Cinematics

Each tool served a specific purpose:

RunwayML — avalanche physics, heavy action, big landscape motion

Veo 3 — tracking shots, aerials, dolly-ins, cinematic movement

Midjourney — key stills, world-building, concept art, emotional close-ups

Suno — music, rumble textures, heartbeat cues

ElevenLabs — narration + sound effects like breathing, sled friction, snow crunch

Premiere Pro — final assembly, grading, timing, transitions, compositing

I created:

snow atmosphere overlays using Runway's Aleph model

environment continuity (sky color, snow scatter, shadow length)

All of this kept the visuals cohesive across wildly different AI tools.
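The tool split above amounts to a routing decision per shot. A toy routing table makes the idea concrete — the shot-type labels are my own shorthand, not a formal taxonomy:

```python
# Rough routing table mirroring the tool breakdown: each generator handled
# the shot types it was best at; everything converged in Premiere Pro.
TOOL_FOR = {
    "avalanche_physics": "RunwayML",
    "heavy_action": "RunwayML",
    "big_landscape_motion": "RunwayML",
    "tracking_shot": "Veo 3",
    "aerial": "Veo 3",
    "dolly_in": "Veo 3",
    "key_still": "Midjourney",
    "concept_art": "Midjourney",
    "emotional_closeup": "Midjourney",
    "music": "Suno",
    "narration": "ElevenLabs",
    "sound_effect": "ElevenLabs",
}

def route(shot_type):
    """Pick the generator for a shot; anything unlisted falls through to
    final assembly in Premiere Pro."""
    return TOOL_FOR.get(shot_type, "Premiere Pro")
```

Encoding the split this way also makes the mismatch problem explicit: every value in the table is a different "visual language" that later has to be unified in the grade.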

  4. Avalanche Engineering

This sequence required:

Multiple takes in Veo 3 to get it right

The biggest challenge: making it huge without losing clarity.

I treated it like a real VFX simulation, tuning details scene by scene.

  5. Sound & Atmosphere

This was critical for realism.

I layered:

heartbeat rhythms that match Niko’s emotional state

low-frequency avalanche rumble

wind hiss

frost crackling

sled-runner friction

breath clouds

dog panting + barks

piano-based orchestral rises

Every sound cue had to feel physical — like the camera was actually there.
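To illustrate the layering principle, here is a minimal, standard-library-only Python sketch that mixes a synthetic heartbeat pulse over a low-frequency rumble bed and writes a mono WAV. The real stems came from Suno and ElevenLabs and were mixed in Premiere; the frequencies and envelopes below are illustrative guesses, not production values:

```python
import math
import struct
import wave

RATE = 44100
DUR = 2.0  # seconds of audio to render
N = int(RATE * DUR)

def heartbeat(t):
    # Two decaying thumps per second ("lub-dub"), pitched around 60 Hz.
    phase = t % 1.0
    env = 0.0
    for onset in (0.0, 0.25):
        if phase >= onset:
            env += math.exp(-18.0 * (phase - onset))
    return env * math.sin(2 * math.pi * 60 * t)

def rumble(t):
    # Low-frequency avalanche bed: a 30 Hz sine with a slow amplitude wobble.
    return 0.4 * (1 + 0.5 * math.sin(2 * math.pi * 0.3 * t)) * math.sin(2 * math.pi * 30 * t)

samples = []
for i in range(N):
    t = i / RATE
    mix = 0.6 * heartbeat(t) + 0.4 * rumble(t)  # gain-stage the two layers
    samples.append(max(-1.0, min(1.0, mix)))    # hard clip as a safety net

with wave.open("mix_sketch.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)  # 16-bit PCM
    f.setframerate(RATE)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```

Even this toy version shows the core trick: each layer gets its own envelope and gain, and the sense of physicality comes from how they sit against each other rather than from any single sound.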

  6. Editing + Compositing

In Premiere, I stitched all shots together with:

speed adjustments

cross-fades

snow overlays

color correction

lens blur

color matching across tools

tiny timing tweaks to hold tension

This part took the most traditional “filmmaker” work.

Challenges we ran into

  1. Character Drift

Every new shot risked the dogs morphing into:

wolves, foxes, Alaskan malamutes, random fantasy creatures

Fixing this became a full-time job.

  2. Emotional Subtlety

AI doesn’t naturally produce micro-expressions like:

fear in the eyes, breath trembling, a kid waking up slowly, dogs communicating with glances

  3. Avalanche Scale

I wanted the avalanche to be the biggest ever put in an AI film.

Balancing scale, detail, readability, physics, dust-cloud density, and sunlight direction was extremely difficult.

  4. Tool Mismatch

Every model speaks a different “visual language.”

Unifying them required:

masks, grading, reframing, layering, reference locking

It took a ton of trial and error.

Accomplishments that we're proud of

An almost cohesive film made from multiple AI tools, stitched into one visual identity

Stable character designs across dozens of shots

A massive avalanche sequence that feels cinematic and physically believable

A consistent emotional tone, despite using generative tools

A working JSON shot-by-shot pipeline that behaves like real film direction

Soundscapes that feel tactile, not synthetic

Building something that feels human, not “AI-ish”

And honestly — proving that three Pomskies can feel epic was extremely satisfying.

What we learned

AI is powerful, but story is still a human job.

Structure and constraints are everything.

Emotional beats matter more than spectacle.

Consistency requires obsessive detail.

Tools are only as good as the direction behind them.

Improvisation — “vibe-coding” — works surprisingly well for generative filmmaking.

The future of film may be hybrid: human vision + AI-enhanced execution.

And most importantly:

Underdog stories resonate for a reason. We all want to believe something small can still win.

What's next for Behind the Scenes – The UnderDogs

I want to turn this into something bigger.

  1. A live-action version

I’d love to raise a small budget, find sponsors, or run a crowdfund to:

fly Odin, Sofi, and Lotus to Alaska

film real sled footage

mix it with AI-enhanced cinematics

shoot practical avalanche plates

capture real northern lights

blend real snow with AI worlds

  2. A longer short or featurette

Expand the story, deepen Niko’s arc, add rival mushers, and raise the wilderness danger.

  3. A full on-location BTS documentary

Show all the AI + real-life filmmaking work together.

  4. Film festival submissions

Take this beyond the competition.

  5. Build an “AI Filmmaker Pipeline”

Share:

templates

shot structures

LUT packs

environmental rules

continuity workflows

All of it so other creators can build their own stories.

Built With

RunwayML · Veo 3 · Midjourney · Suno · ElevenLabs · Adobe Premiere Pro
