Inspiration

We were inspired by how agent-based simulations can predict human behavior before real campaigns ever launch. Most marketers still rely on intuition, or on A/B tests that only happen after the money has been spent. We wanted to change that by letting brands see how different groups of people would react to their ad creatives before going live.

What it does

AdVisor lets marketers upload any ad (image or video) and uses multimodal AI to break it down by tone, pacing, CTA, sentiment, and more. It then routes the ad to a set of AI agents, each representing a different buyer society such as “Time-Starved Parents” or “Bio-Optimization Enthusiasts.” Those agents simulate realistic emotional and rational reactions, highlight weak spots in the messaging, and generate better ad variants tailored to each audience.

How we built it

We split the system into three major components:

- Feature Extraction (Matthew): Built a multimodal pipeline combining CLIP embeddings, sentiment models, and text parsers to extract signals like visual tone, logo prominence, and CTA urgency.
- Agent Simulation (Sharan): Designed generative agents that embody audience archetypes and return qualitative feedback, objections, and variant ideas.
- Society Selection (Yashas): Created the routing engine that interprets the ad’s extracted signals and picks which societies to simulate based on brand metadata and feature alignment (rough scoring sketch below).
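The core idea behind society selection is simple enough to sketch: score each candidate society by how well the ad’s extracted signals line up with what that society cares about, and keep the per-feature contributions around so the choice stays explainable. The snippet below is a minimal illustration of that idea, not our exact code; the feature names, weights, and function names are made up.

```python
# Illustrative sketch of interpretable society routing.
# Feature names, weights, and society traits are hypothetical.
from dataclasses import dataclass


@dataclass
class Society:
    name: str
    # feature name -> how much this society cares about it (0..1)
    weights: dict[str, float]


def score_society(features: dict[str, float], society: Society) -> tuple[float, dict[str, float]]:
    """Return an alignment score plus per-feature contributions for transparency."""
    contributions = {
        feat: features.get(feat, 0.0) * weight
        for feat, weight in society.weights.items()
    }
    return sum(contributions.values()), contributions


def route(features: dict[str, float], societies: list[Society], top_k: int = 2) -> list[tuple[str, float]]:
    """Pick the top-k societies whose interests best align with the ad's signals."""
    scored = [(s.name, score_society(features, s)[0]) for s in societies]
    return sorted(scored, key=lambda x: x[1], reverse=True)[:top_k]


# Example: an ad with high CTA urgency and a calm visual tone.
ad_features = {"cta_urgency": 0.9, "visual_tone_calm": 0.7, "logo_prominence": 0.3}
societies = [
    Society("Time-Starved Parents", {"cta_urgency": 0.8, "visual_tone_calm": 0.5}),
    Society("Bio-Optimization Enthusiasts", {"logo_prominence": 0.6, "visual_tone_calm": 0.9}),
]
print(route(ad_features, societies))
```

Because every score decomposes into per-feature contributions, the same numbers that drive routing can be surfaced as the explanation for why a society was chosen.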

We built the frontend in Next.js and React, inspired by the clean UX of societies.io, and the backend in Python and FastAPI with a shared JSON schema for features, societies, and agent outputs.
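The shared schema is just a set of typed models that the pipeline and the frontend both agree on. Roughly, it looks like the sketch below, written with Pydantic (which FastAPI uses for request/response validation); the field and endpoint names here are simplified and illustrative rather than our exact schema.

```python
# Sketch of the shared JSON schema as Pydantic models behind a FastAPI endpoint.
# Field names and the /simulate route are illustrative.
from fastapi import FastAPI
from pydantic import BaseModel


class AdFeatures(BaseModel):
    visual_tone: str
    cta_urgency: float
    logo_prominence: float
    sentiment: str


class AgentFeedback(BaseModel):
    society: str
    reaction: str
    objections: list[str]
    variant_ideas: list[str]


class SimulationResult(BaseModel):
    features: AdFeatures
    selected_societies: list[str]
    feedback: list[AgentFeedback]


app = FastAPI()


@app.post("/simulate", response_model=SimulationResult)
def simulate(features: AdFeatures) -> SimulationResult:
    # Placeholder: the real pipeline runs society selection and agent simulation here.
    return SimulationResult(features=features, selected_societies=[], feedback=[])
```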

Challenges we ran into

- Balancing creativity with structure so agents speak authentically while staying interpretable.
- Ensuring feature extraction worked across both image and video formats without blowing past processing time limits (frame-sampling sketch below).
- Designing transparent scoring logic that can explain why a certain society was selected.
- Integrating many moving parts while keeping total runtime under five minutes per batch of ads.
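For video, one simple way to stay inside a time budget is to avoid embedding every frame: sample a few evenly spaced frames and average their CLIP embeddings. The sketch below shows that pattern; the model choice, frame count, and helper name are ours for illustration and not necessarily what the final pipeline does.

```python
# Rough sketch: sample evenly spaced frames from a video and average their
# CLIP embeddings to get one vector per creative. Model and frame count are illustrative.
import cv2
import torch
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")


def video_embedding(path: str, num_frames: int = 8) -> torch.Tensor:
    cap = cv2.VideoCapture(path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    frames = []
    for idx in range(0, total, max(total // num_frames, 1)):
        cap.set(cv2.CAP_PROP_POS_FRAMES, idx)
        ok, frame = cap.read()
        if ok:
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    cap.release()
    inputs = processor(images=frames, return_tensors="pt")
    with torch.no_grad():
        embeddings = model.get_image_features(**inputs)
    # One vector per video: the mean of the sampled frame embeddings.
    return embeddings.mean(dim=0)
```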

Accomplishments that we're proud of

- Built an end-to-end working demo that uploads a creative, extracts features, selects societies, and shows agent feedback in under five minutes.
- Created an interpretable society-routing system that bridges raw data and human-like feedback.
- Delivered a frontend experience that feels like testing your ad inside an artificial world of real consumers.

What we learned

We learned that large language models can model collective behavior when structured as agent networks, not just single responses. We also saw the power of combining symbolic logic (feature scoring) with generative reasoning (agent reactions) to produce feedback that feels both measurable and human.
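As a concrete illustration of that split: the deterministic feature scores can be injected straight into a persona-conditioned prompt, so an agent’s “gut reaction” stays grounded in measurable signals. The sketch below assumes an OpenAI-style chat API and made-up prompt wording; it shows the pattern, not our exact agent code.

```python
# Sketch of combining symbolic feature scores with a generative persona prompt.
# Assumes an OpenAI-style chat API; model name and prompt text are illustrative.
from openai import OpenAI

client = OpenAI()


def agent_reaction(persona: str, feature_scores: dict[str, float]) -> str:
    scores = ", ".join(f"{k}={v:.2f}" for k, v in feature_scores.items())
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": (
                    f"You are a consumer from the '{persona}' audience segment. "
                    "React to an ad honestly, noting objections and what would make you click."
                ),
            },
            {
                "role": "user",
                "content": f"Measured ad signals: {scores}. Give your gut reaction in 3 sentences.",
            },
        ],
    )
    return response.choices[0].message.content


print(agent_reaction("Time-Starved Parents", {"cta_urgency": 0.9, "visual_tone_calm": 0.7}))
```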

What's next for AdVisor

- Add social influence modeling, where agents within a society affect each other’s opinions to simulate virality.
- Integrate Chroma for vector-based retrieval of past ad-society mappings (rough sketch below).
- Connect with Fetch.ai for decentralized agent hosting and Bright Data for real audience enrichment.
- Expand the Insights Dashboard to show cross-society resonance maps and creative iteration history.
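As a rough idea of what the Chroma integration could look like: store a short description of each past ad alongside which societies resonated with it, then query by a new ad’s description. The collection name, documents, and metadata fields below are hypothetical.

```python
# Hypothetical sketch of retrieving past ad-society mappings with Chroma.
import chromadb

client = chromadb.Client()
collection = client.create_collection("ad_society_mappings")

# Store a past creative with the society that resonated most strongly.
collection.add(
    ids=["ad_001"],
    documents=["Calm-toned video ad, urgent CTA, meal-kit subscription"],
    metadatas=[{"top_society": "Time-Starved Parents", "resonance": 0.82}],
)

# Retrieve the most similar past ads for a new creative.
results = collection.query(
    query_texts=["Fast-paced fitness supplement ad with bold CTA"],
    n_results=3,
)
print(results["metadatas"])
```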

AdVisor aims to become a full creative intelligence platform where every marketer can test, refine, and evolve their ads through living AI societies before spending a single dollar.

Built With

CLIP, FastAPI, Next.js, Python, React