Inspiration
Harmony of the Deep was inspired by the idea that the ocean has its own unseen symphony—rhythms, migrations, relationships, mysteries, and brutal realities.
What it does
The film presents an underwater world awakened by the resonant call of whales. As the song travels through the water, it stirs light, color, and movement, inspiring harmony and connection among the creatures that inhabit the deep. It’s a short, lyrical journey centered on mood, beauty, and emotional resonance rather than dialogue or narrative exposition.
How we built it
All visuals were created using AI-generation tools, including OpenArt and Kling. Audio was generated through Suno and then refined in Adobe Audition. I assembled the final cut in Adobe Premiere, layering motion, pacing, and sound design to create a cohesive underwater experience. Color grading, rhythm, and timing were guided by the flow of the whale song itself.
Challenges we ran into
One of the biggest challenges was maintaining visual consistency across shots generated by different tools and generations. Underwater lighting, particle behavior, and creature anatomy vary widely across AI outputs, so I spent considerable time retiming clips, matching color palettes, and ensuring the world felt unified. Creating motion that felt organic and fluid was another major hurdle, especially in an environment where everything technically "floats" and drifts.
The hardest challenge, however, was AI lip-sync. Current lip-sync tools are trained almost exclusively on human facial structures—not humpback whales, not deep-sea creatures, and not alien marine life. Getting the whale vocalizations to visually match the movement of a non-human mouth required extensive manual adjustments, experimentation, and creative problem-solving to make the animation feel believable.
Accomplishments that we're proud of
I’m proud of how cohesive the final film feels despite being built from multiple AI sources. The underwater atmosphere—the bioluminescent textures, the drifting particles, the slow, immersive pacing—came together into a world that feels alive. The music and soundscape also play a huge role, and the harmony between audio and visuals is something I’m very happy with.
What we learned
This project reinforced how much AI-assisted filmmaking benefits from traditional film instincts: pacing, composition, thematic clarity, and emotional flow. I learned more about guiding AI models toward consistent environments, and how to push Suno’s musical output to match a specific emotional tone. It also highlighted the importance of manual curation—choosing the right shots matters far more than sheer quantity.
What's next for "Harmony of the Deep"
I’d love to expand the world of Harmony of the Deep into a longer underwater sequence or a series exploring different ecosystems through music-driven narratives. Future iterations may integrate more character-driven elements or explore the mythic possibilities of deep-sea lore. I’m also interested in developing interactive or looping underwater environments using these tools.
Built With
- adobe
- kling
- openart
- suno