Inspiration

It all started with an idea of someone shooting at the moon. I asked myself, why would anyone do that? Each answer led to another question, and the story of "Eve and Adam" began to unfold. I wanted to capture the nostalgia of films like Close Encounters of the Third Kind and Escape to Witch Mountain, so I used Gemini to research the equipment those films were shot on and to help craft prompts that would maintain that classic look. I used Veo3 text-to-video and image-to-video for the visuals.

How I built it

This project marked my first experience with JSON prompting. I skipped the social media debate about whether JSON prompting gives better results; after three years of crafting natural language prompts in paragraph form, the structure this format provided simply made sense to my brain.

First, I described the shot I wanted to create to Gemini, which then gave me a prompt in JSON format that I would feed into Veo3. I'd run an initial generation without even reading the prompt, just to see the output. Once I saw how far off the initial generation was from my vision, I'd then read the prompt and find the exact category I needed to tweak—be it lighting, composition, or the subject itself—thanks to the prompt's structured format. To me, this was far easier than revising a block of natural language text.
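As an illustration, a structured prompt of this kind might look like the sketch below. The field names here are my own invented example for the moon-shot scene, not an official Veo 3 schema; the actual prompts were generated by Gemini:

```json
{
  "shot": {
    "composition": "wide establishing shot, shallow depth of field",
    "camera_motion": "slow dolly-in"
  },
  "subject": {
    "description": "a lone figure aiming a rifle at the full moon",
    "wardrobe": "weathered denim jacket, late-1970s styling"
  },
  "lighting": "cool moonlight key with warm practical fill, visible film grain",
  "style": "1970s sci-fi, in the spirit of Close Encounters of the Third Kind",
  "audio": "distant wind, faint orchestral swell"
}
```

With the prompt broken into named categories like these, tweaking just the lighting or the composition means editing a single field rather than rewriting an entire paragraph.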

Another advantage: I could copy and paste the entire JSON prompt into Veo3 due to its large token limit, whereas other platforms would hit their character count maximum. I did this shot by shot, in a linear fashion. I added clips to my timeline in Premiere until the entire story was roughly laid out. On this first pass, the quality of the shots wasn't the priority; I just needed placeholders to begin understanding the film's flow, pacing, and tone. Then I went back to the start of the timeline and regenerated and refined each shot. I did this hundreds of times.

Challenges we ran into

I used Runway Aleph to fix inconsistencies in certain shots, as well as to change the lighting from day to night while keeping the same characters. It came out right as I was nearing the end of making this, so I definitely need to spend more time with it, but this kind of natural language in-painting is amazing.

Veo3 provides an 8-second clip with sound effects, dialogue, and sometimes even baked-in music. For better sound design and mixing control, I needed individual stems, so I used Lalal to split the audio.

Accomplishments that we're proud of

Veo3’s audio often reflects the scene’s tone—for example, tense scenes get tense music—but it cuts off abruptly at the 8-second mark. One trick I used was to upload the 8-second clip’s audio to Suno and use it as inspiration for a longer sound design or music track. This gave me better audio consistency when cutting between shots in a particular scene.

The main theme music is an original song I wrote years ago when I was in a band. I uploaded the recording to Suno and created variations in different styles that serve as Adam's theme song. I love how this tech isn't just helping me visualize new ideas; it's also breathing new life into old ones.

What we learned

Veo3 is amazing for baked-in dialogue with video, but the actors tend to overact. If we truly want to fulfill the job of an "AI director," then we need to be given more control over the AI actors.

What's next for Eve & Adam

Eve & Adam has been picked up for development at Promise Studios. I am currently working on adapting it as a feature film.

Built With

Veo3, Gemini, Runway Aleph, Suno, Lalal, Adobe Premiere