Inspiration
Our inspiration came from the NFL and seeing how time-consuming it was to create highlight reels. We wanted to leverage AI to increase efficiency and productivity.
What it does
We created an AI-powered video-generation system that produces compelling highlight reels of the 2024 Summer Olympics from natural-language prompts.
How we built it
We used Next.js, Remotion, Firebase, and Twelve Labs to produce the script and highlight reels.
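A rough sketch of how such a pipeline could be orchestrated is below. The Twelve Labs and Remotion integrations are stubbed out with hypothetical placeholder functions (`searchClips`, `draftScript`, and `assembleReel` are our own illustrative names, not real SDK calls), so this shows the shape of the flow rather than the actual implementation.

```typescript
// Hypothetical sketch of the highlight-reel pipeline. All functions here are
// illustrative stubs, not real Twelve Labs or Remotion API calls.

interface Clip {
  start: number; // seconds into the source footage
  end: number;
  label: string;
}

// Stub: in the real system, this step would query Twelve Labs' Marengo model
// to find moments matching a natural-language prompt.
function searchClips(_prompt: string): Clip[] {
  return [
    { start: 45, end: 52, label: "medal ceremony" },
    { start: 12, end: 18, label: "100m final finish" },
  ];
}

// Stub: in the real system, Pegasus-1 would draft narration for each clip.
function draftScript(clips: Clip[]): string[] {
  return clips.map((c) => `Narration for ${c.label}`);
}

// Orders clips chronologically and pairs each with its narration line,
// producing the data a Remotion composition would render.
function assembleReel(prompt: string): { clip: Clip; line: string }[] {
  const clips = searchClips(prompt).sort((a, b) => a.start - b.start);
  const lines = draftScript(clips);
  return clips.map((clip, i) => ({ clip, line: lines[i] }));
}

const reel = assembleReel("best sprint moments of the 2024 Olympics");
```

The key design choice this sketches is keeping search, scripting, and assembly as separate stages, so each model's output can be inspected (or regenerated) before the final render.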
Challenges we ran into
We ran into network connectivity issues and API rate-limiting.
Accomplishments that we're proud of
We're proud of the seamless integration of script, voice-over, and visuals.
What we learned
We learned about Twelve Labs' Marengo video foundation model for visual analysis and metadata generation, its Pegasus-1 model for script generation and video assembly, and Jockey, Twelve Labs' open-source video agent for conversational video editing.
What's next for Gold Metal Clips
Some future implementations include adding support for multiple languages, granular editing of the highlight reels, and exporting the videos in different formats.
Built With
- firebase
- marengo
- nextjs
- openai
- pegasus
- remotion

