In today’s fast-paced digital landscape, creative teams are under constant pressure to produce fresh, high-performing ads across multiple platforms: YouTube, Meta, TikTok, Reddit, and Pinterest. Yet most of their time goes to repetitive work (tagging scenes, testing variations, and analyzing results) instead of ideating new creative concepts.

We realized the advertising world lacked a true AI-powered creative partner, one that could:

- Understand video content like a creative director,
- Learn from what’s already working across social platforms, and
- Generate new ad ideas optimized for performance.

That’s how AdBuddy was born: a unified AI Creative Intelligence Platform that helps teams create, test, and analyze ads across all major platforms, powered by Twelve Labs, the Fal API, and social media intelligence.
AdBuddy is your AI Creative Co-pilot for advertising. It can:

- 🎬 Analyze video ads using Twelve Labs’ multimodal AI, detecting scenes, emotions, visual aesthetics, and brand moments.
- 🔍 Fetch and compare similar ads across platforms using the YouTube, Meta, TikTok, Reddit, and Pinterest APIs, providing real-world performance data.
- 💡 Generate new ad concepts using Fal’s video generation API: describe your idea in a prompt, and AdBuddy creates ready-to-test variations.
- 📈 Predict and benchmark performance by comparing your ad’s emotional and narrative structure with top-performing campaigns.
- 🧭 Deliver creative insights on what works, what doesn’t, and why, helping marketers iterate intelligently.

In short: AdBuddy helps you create smarter ads, faster.
## How we built it

We designed AdBuddy as a full-stack system integrating multiple layers of AI, data, and creative intelligence.

### 🧩 Architecture Overview

**Frontend (TypeScript + React + Tailwind)**

- Built in TypeScript for strong typing and modularity.
- Dynamic dashboard for uploading, analyzing, and visualizing ad insights.
- Emotion timeline graphs, brand highlight overlays, and AI-driven recommendations.

**Backend (Python + FastAPI + PostgreSQL)**

- FastAPI handles requests, orchestration, and async job scheduling.
- PostgreSQL stores ad metadata, embeddings, insights, and user sessions.
- Integrations with Twelve Labs, the Fal API, and the platform APIs.

### ⚙️ Core Pipeline

1. **Video Ingestion**: the user uploads or imports an ad, and Twelve Labs’ index API extracts multimodal embeddings (visual, audio, and text).
2. **Analysis**: we query Twelve Labs’ moment and search endpoints to find emotional beats, logo appearances, and story transitions; transcript data is combined with embeddings for narrative analysis.
3. **Social Media Intelligence**: using the YouTube, Meta, TikTok, Reddit, and Pinterest APIs, AdBuddy fetches similar ads and their public performance metrics (likes, CTR, watch rate), which are embedded and compared for creative benchmarking.
4. **Creative Generation**: the Fal API turns user prompts (like “A 15-second energetic tech ad with upbeat music”) into new video ad concepts, which are automatically re-analyzed by Twelve Labs for emotion and tone alignment.
5. **Dashboard Visualization**: the frontend interactively displays emotional arcs, brand visibility, performance comparisons, and creative suggestions.
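The orchestration behind this pipeline can be sketched as an async flow. This is a simplified illustration, not our production code: `index_video`, `analyze_video`, and `generate_variant` are hypothetical stand-ins for the real Twelve Labs and Fal client calls, whose actual SDK method names and payloads differ.

```python
import asyncio
from dataclasses import dataclass, field

@dataclass
class AdAnalysis:
    """Condensed result of the moment/search queries for one video."""
    video_id: str
    emotional_beats: list[str] = field(default_factory=list)
    brand_moments: list[str] = field(default_factory=list)

async def index_video(video_path: str) -> str:
    """Stub for Twelve Labs indexing: returns a video id once embeddings exist."""
    await asyncio.sleep(0)  # stands in for the upload + indexing round trip
    return f"vid_{abs(hash(video_path)) & 0xFFFF:04x}"

async def analyze_video(video_id: str) -> AdAnalysis:
    """Stub for the moment/search endpoint queries described above."""
    await asyncio.sleep(0)
    return AdAnalysis(video_id, ["upbeat intro", "product reveal"], ["logo @ 0:03"])

async def generate_variant(prompt: str) -> str:
    """Stub for a Fal text-to-video call; returns a path to the generated clip."""
    await asyncio.sleep(0)
    return f"/tmp/generated_{abs(hash(prompt)) % 1000}.mp4"

async def pipeline(video_path: str, prompt: str) -> tuple[AdAnalysis, AdAnalysis]:
    """Ingest -> analyze -> generate -> re-analyze, overlapping work where possible."""
    video_id = await index_video(video_path)
    # Analyze the original while the new variant is being generated.
    original_task = asyncio.create_task(analyze_video(video_id))
    variant_path = await generate_variant(prompt)
    variant_id = await index_video(variant_path)
    original = await original_task
    variant = await analyze_video(variant_id)
    return original, variant

original, variant = asyncio.run(
    pipeline("ad.mp4", "A 15-second energetic tech ad with upbeat music")
)
```

Running the original analysis concurrently with variant generation is the same pattern that kept our UI responsive under FastAPI's async job scheduling.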
## Challenges we ran into

- 🎨 **Video generation prompting**: tuning Fal API prompts to produce coherent, visually appealing ad clips was a learning curve.
- 💾 **Performance and latency**: handling large uploads, indexing via Twelve Labs, and fetching live platform data, all while keeping the UI responsive, required an optimized async design.
- ⏱️ **Hackathon constraints**: integrating three API families (Twelve Labs, Fal, and the social platforms) within 48 hours was challenging but rewarding.
## Accomplishments that we're proud of

- 🎉 Built an end-to-end system that can ingest, analyze, and generate ads automatically.
- 🧠 Leveraged Twelve Labs for deep video understanding: detecting emotions, brand presence, and narrative arcs.
- 🎬 Integrated the Fal API to create video content directly from text prompts.
- 🔍 Pulled live social data from multiple APIs to benchmark and learn from successful ads.
- 📊 Delivered a beautiful, interactive TypeScript dashboard showing emotional timelines and creative insights.
- ⚡ Enabled marketers to go from concept to creative intelligence in minutes, not hours.
## What we learned

- 🌐 **APIs can be creative building blocks**: combining Twelve Labs (analysis), Fal (generation), and social APIs (benchmarking) gave us a full creative loop.
- 🎥 **Video embeddings are a powerful abstraction**: they let us measure “creative similarity” mathematically.
- 🤖 **Prompt engineering for creativity is both art and science**: small wording changes in Fal prompts drastically affected video quality.
- 💡 **Cross-platform optimization matters**: what works on TikTok may fail on YouTube, and our insights engine exposed these nuances.
- 🧠 **AI doesn’t replace creativity**; it supercharges it.
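Measuring “creative similarity” mathematically boils down to comparing embedding vectors, typically with cosine similarity. A minimal sketch, using toy 4-dimensional vectors as stand-ins for the much higher-dimensional embeddings a video model actually produces:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors; 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for real video embeddings.
energetic_ad = [0.9, 0.1, 0.4, 0.2]
similar_ad   = [0.8, 0.2, 0.5, 0.1]
calm_ad      = [0.1, 0.9, 0.1, 0.8]

print(cosine_similarity(energetic_ad, similar_ad))  # high (~0.98): similar creative direction
print(cosine_similarity(energetic_ad, calm_ad))     # lower (~0.31): different creative direction
```

Because the score depends only on vector direction, ads of different lengths or resolutions still compare cleanly, which is what makes ranking “similar ads” across platforms tractable.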
## What's next for AdBuddy

- **Performance prediction models**: train regression models to estimate engagement or conversion before publishing.
- **Brand-specific fine-tuning**: use custom data to align emotion and tone with brand voice.
- **Integration with creative suites**: connect to Adobe Premiere, Canva, and CapCut for seamless editing and publishing.
- **Collaboration tools**: shared dashboards, annotation layers, and creative review workflows.
- **Explainable creative intelligence**: visualize why an ad performs better (scene pacing, emotional structure, visual style).
- **“Creative DNA” mapping**: build unique brand embeddings that describe a company’s storytelling fingerprint over time.

Our long-term vision is to make AdBuddy the go-to AI creative director, helping every brand tell its story faster, smarter, and with data-driven emotion. 💙
## Built With
- fal
- postgresql
- twelvelabs
- typescript