Inspiration
Looking at the Sky was born from a personal need to break the boundary between imagination and reality. It became my first short film created with AI, developed right after taking part in Nouvelle Bug Vol. 3, an experience that pushed me to experiment freely. For the first time, I could give a face to my stories instead of leaving them trapped in my head.
What it does
Looking at the Sky is both a story and a test lab. On one side, it follows a murdered girl stuck between worlds, chased by a soul-hunter while trying to be heard by a creature that doesn’t belong to any realm. On the other, it shows how AI can be used as a directing tool, not a replacement for storytelling.
The project:

- Experiments with AI-generated imagery as emotional language, not just spectacle.
- Uses the "in-between" space to talk about trauma, unfinished stories, and the fear of disappearing.
- Demonstrates how a single creator can build a fully realized, atmospheric short with minimal resources by treating AI as a collaborator in texture, not in intention.
How we built it

The process was hybrid and experimental:

- Concept: A girl wakes up in the afterlife after being murdered, trapped between worlds. She is hunted by a soul-collector while an interdimensional creature listens to her story.
- Visual Exploration: I tested multiple tools to find the right textures, colors, and emotional language for a world stuck between life and death.
- Assembly: I edited the sequence like fragmented memory: poetic, disjointed, and slightly distorted.
- Voice & Tone: I shaped the narration as a quiet fight to return home: intimate, unresolved, and desperate to be believed.

Challenges we ran into

- Achieving visual consistency across different AI tools.
- Translating human emotion into outputs generated by systems that don't feel.
- Keeping a clear narrative thread inside an inherently chaotic medium.
- Making sure the story remained the heart of the project, not the technology.

Accomplishments that we're proud of
What we learned
Working with AI taught me that creativity is not linear; it's iterative. I learned to:

- Blend texture, atmosphere, and narrative into a unified pipeline.
- "Direct" models that have no intention, only response.
- Trust my creative instincts while refining outputs with precision (though not always; sometimes I prefer the raw output).
What's next for Looking at the Sky
Looking at the Sky is a starting point, not an endpoint. Next, I want to:

- Refine the cut with stronger sound design and pacing, pushing the feeling of being "stuck between realms."
- Explore a longer version or anthology format, where other souls in the in-between share their stories with different entities.
- Use what I learned here to build a hybrid project: part AI-generated, part live-action, keeping the same emotional core.
- Submit the short to festivals and experimental showcases that are open to AI-assisted filmmaking, and keep iterating based on audience feedback.

The goal is simple: keep using AI to unlock scale and possibility without losing the most important part, a human story that wants to be seen.
Built With
- artlist
- elevenlabs
- freepik
- kling
- midjourney
- runway