Inspiration
Behind every great ad is a feeling. Adonomics decodes that feeling, tracking joy, humor, and heartbeat moments so brands know exactly why their stories stick and how to make the next one unforgettable. We wanted to build a system that brings science to storytelling and helps creative, marketing, and strategy teams finally speak the same language through data.
What it does
Adonomics is an AI-powered insights and content management system that analyzes video ads to reveal which emotional and creative elements drive performance. It:

- Detects emotions, tone, color, pacing, and storytelling arcs from video ads.
- Maps these creative signals to performance metrics like brand favorability and purchase intent.
- Generates recommendations: what to amplify, what to fix, and what resonates with each audience segment.
- Acts as a unified workspace where marketers, creators, and managers can collaborate around shared insights.
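To make the mapping from creative signals to performance metrics concrete, here is a minimal sketch of what a single analysis result could look like. The field names and the `peakScene` helper are illustrative assumptions, not the actual Adonomics schema:

```typescript
// Hypothetical shape of one ad analysis result: scene-level creative
// signals alongside the performance metrics they are correlated with.
interface SceneSignal {
  startSec: number;
  endSec: number;
  emotion: string;   // e.g. "joy", "nostalgia"
  intensity: number; // 0..1
}

interface AdAnalysis {
  adId: string;
  scenes: SceneSignal[];
  metrics: {
    brandFavorability: number; // measured lift, 0..1
    purchaseIntent: number;    // measured lift, 0..1
  };
  recommendations: string[];   // "amplify" / "fix" notes
}

// Toy helper: find the scene with the strongest emotional peak,
// the kind of "heartbeat moment" a recommendation might call out.
function peakScene(analysis: AdAnalysis): SceneSignal {
  return analysis.scenes.reduce((a, b) => (b.intensity > a.intensity ? b : a));
}

const example: AdAnalysis = {
  adId: "demo-001",
  scenes: [
    { startSec: 0, endSec: 5, emotion: "curiosity", intensity: 0.4 },
    { startSec: 5, endSec: 12, emotion: "joy", intensity: 0.9 },
  ],
  metrics: { brandFavorability: 0.12, purchaseIntent: 0.08 },
  recommendations: ["Amplify the joy peak at 0:05-0:12"],
};
```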
How we built it
We built Adonomics as a Next.js full-stack application with a multi-stage AI pipeline: videos are uploaded to the Twelve Labs API for scene-by-scene analysis (extracting emotional content, objects, and pacing), then GPT-4o synthesizes these insights into actionable recommendations correlated with performance metrics.

A key innovation is our intelligent user profiling system, which customizes the dashboard and insights based on each user's background: technical users receive more granular data and methodology details, while creative-focused users get visual-first summaries tailored to their expertise. We also built a real-time analysis engine for live advertisements that monitors active campaigns across platforms (YouTube, Meta, TikTok), tracking ROAS, engagement, and creative performance with live dashboards and conversion funnels.

We used MongoDB to store analysis results, user profiles, and preferences, and implemented everything through Next.js API routes to keep the architecture simple. The biggest challenge was orchestrating multiple async AI services with different processing times, which we solved with a polling-based status tracking system.
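The polling-based status tracking can be sketched roughly as follows. This is a simplified illustration, assuming each async AI service (video indexing, LLM synthesis) exposes a status check returning `"pending"`, `"ready"`, or `"failed"`; the function names and intervals are assumptions, not the production code:

```typescript
type JobStatus = "pending" | "ready" | "failed";

// Poll a service's status endpoint until it leaves the "pending" state,
// sleeping between attempts and giving up after maxAttempts.
async function pollUntilReady(
  checkStatus: () => Promise<JobStatus>,
  intervalMs = 2000,
  maxAttempts = 30,
): Promise<JobStatus> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await checkStatus();
    if (status !== "pending") return status;            // ready or failed: stop
    await new Promise((r) => setTimeout(r, intervalMs)); // wait, then re-poll
  }
  return "failed"; // treat exhaustion as failure so callers can surface an error
}
```

In an architecture like the one described, a Next.js API route could run this loop against each service's task-status endpoint (e.g. the Twelve Labs indexing task) before handing results to the next pipeline stage.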
Challenges we ran into
- Integrating multimodal outputs (text, vision, audio) into a unified emotion timeline.
- Quantifying abstract emotions like nostalgia or humor into measurable metrics.
- Balancing technical complexity with an interface that’s intuitive for non-technical marketers.
- Working with limited training data for culturally sensitive ad evaluation, which required smart augmentation.
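One way to picture the unified emotion timeline problem: signals from different modalities arrive at different timestamps and must be merged onto a common time axis. The sketch below buckets signals by second and averages their scores; the modality names and equal weighting are assumptions for illustration, not our exact method:

```typescript
interface ModalitySignal {
  modality: "vision" | "audio" | "text";
  atSec: number;  // timestamp of the detection in the ad
  emotion: string;
  score: number;  // 0..1 confidence
}

// Merge per-modality detections into one score per second by averaging
// everything that lands in the same one-second bucket, so no single
// channel dominates the timeline.
function unifiedTimeline(signals: ModalitySignal[]): Map<number, number> {
  const buckets = new Map<number, { sum: number; n: number }>();
  for (const s of signals) {
    const sec = Math.floor(s.atSec);
    const b = buckets.get(sec) ?? { sum: 0, n: 0 };
    b.sum += s.score;
    b.n += 1;
    buckets.set(sec, b);
  }
  const out = new Map<number, number>();
  for (const [sec, { sum, n }] of buckets) out.set(sec, sum / n);
  return out;
}
```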
Accomplishments that we're proud of
- Built an end-to-end multimodal pipeline that transforms subjective creative analysis into quantifiable insights.
- Delivered scene-level emotion tracking with real-time visualization.
- Created a collaborative insight layer connecting data analysts, content teams, and brand managers, improving communication across the funnel.
- Validated the concept on well-known ads, showing measurable prediction differences.
What we learned
- Emotional resonance can be engineered: creativity and data are not opposites.
- AI can act as a creative mentor, helping teams iterate faster and smarter.
- Collaboration between business and technical roles is key to making insights usable and impactful.
What's next for Adonomics
- Expand into multi-platform creative intelligence, analyzing TikTok, YouTube, and Instagram content.
- Build a recommendation engine that suggests the best creative style for each target audience.
- Develop an “Adonomics Score”: a predictive KPI that estimates campaign success pre-launch.
- Enable real-time creative feedback loops for content teams through an integrated workspace.
Built With
- next.js


