Inspiration
Every ad platform tells you what happened: clicks, views, scroll depth. None of them tell you why. A viewer can sit through a full 30-second ad even though their brain checked out at second 12, and watch time still calls that a success. We wanted to close that gap by making brain activity legible to the people who actually make ads.
What it does
Grey Matter takes predicted brain activation data from Meta's TRIBE V2 model and translates it into actionable reports for advertising professionals. Upload a video, and you get three layers of analysis: a Scorecard for performance marketers (attention score, peak moment, drop-off, recommended edit), an Emotional Arc for creative directors (narrative of the viewer's emotional journey), and a Timestep Timeline for content creators (second-by-second breakdown with engagement scores and feeling tags at each moment).
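Since the timeline layer carries per-second engagement scores, the scorecard numbers (attention, peak moment, drop-off) can be derived from it. A minimal sketch of that relationship, assuming illustrative type and field names that are not the actual Grey Matter schema:

```typescript
// Hypothetical shapes for two of the report layers (field names are illustrative).
interface TimestepEntry {
  second: number;
  engagement: number; // 0-100 engagement score for this moment
  feelings: string[]; // feeling tags, e.g. ["curiosity", "tension"]
}

interface Scorecard {
  attentionScore: number;       // mean engagement across the whole ad
  peakSecond: number;           // moment of highest engagement
  dropOffSecond: number | null; // first post-peak collapse, if any
}

// Derive the scorecard from the timeline: mean engagement, the peak
// moment, and the first second after the peak where engagement falls
// below half of the peak value.
function buildScorecard(timeline: TimestepEntry[]): Scorecard {
  const attentionScore =
    timeline.reduce((sum, t) => sum + t.engagement, 0) / timeline.length;
  const peak = timeline.reduce((a, b) => (b.engagement > a.engagement ? b : a));
  const dropOff = timeline.find(
    (t) => t.second > peak.second && t.engagement < peak.engagement / 2,
  );
  return {
    attentionScore,
    peakSecond: peak.second,
    dropOffSecond: dropOff ? dropOff.second : null,
  };
}
```

The "recommended edit" field would then point at the drop-off second, which is where a cut or re-pace is most likely to help.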
How we built it
We built a Next.js frontend with a streaming API route that pipes TRIBE V2 brain activation arrays (predicted neural responses across 15+ brain regions per timestep) through the Claude Sonnet 4 API, along with a brain-region-to-function reference sheet and a structured prompt. The LLM interprets the raw neuroscience data and produces all three report layers in a single streamed response. The UI presents results in a tabbed interface so each user type sees the layer most relevant to them.
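The step that turns raw activation arrays plus the reference sheet into prompt text could look roughly like this. The region names, functional glosses, and helper name are our own assumptions for illustration, not the actual implementation:

```typescript
// Illustrative region-to-function reference sheet (a small subset; names assumed).
const REGION_FUNCTIONS: Record<string, string> = {
  amygdala: "emotional salience and threat response",
  ventral_striatum: "reward anticipation",
  dlPFC: "sustained attention and working memory",
};

// Render one timestep's predicted activations as a compact prompt line,
// pairing each region with its functional gloss so the LLM can interpret
// the numbers instead of guessing what each region does.
function formatTimestep(
  second: number,
  activations: Record<string, number>,
): string {
  const parts = Object.entries(activations).map(
    ([region, value]) =>
      `${region} (${REGION_FUNCTIONS[region] ?? "unknown"}): ${value.toFixed(2)}`,
  );
  return `t=${second}s | ${parts.join("; ")}`;
}
```

Lines like these for every timestep, concatenated under the structured prompt, are what the streaming route would send to the model.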
Challenges we ran into
TRIBE V2 is slow: a 5-second video takes roughly 20 minutes to process through the model. Translating dense activation arrays into language that feels natural and actionable without losing neuroscientific accuracy was a constant balancing act. We also had to design a single-prompt architecture that produces three distinct output layers, each written for a different audience, in one pass.
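One common way to get three audience-specific layers from a single pass is to have the prompt ask for clearly delimited sections and split the streamed text afterward. The delimiter convention below is our own sketch, not necessarily what Grey Matter uses:

```typescript
// Split a single LLM response into named sections delimited by
// header lines like "=== SCORECARD ===" (delimiter format is illustrative).
function splitLayers(response: string): Record<string, string> {
  const layers: Record<string, string> = {};
  let current: string | null = null;
  for (const line of response.split("\n")) {
    const match = line.match(/^=== (\w+) ===$/);
    if (match) {
      // Start collecting a new section.
      current = match[1];
      layers[current] = "";
    } else if (current !== null) {
      layers[current] += line + "\n";
    }
  }
  // Trim trailing whitespace from each section body.
  for (const key of Object.keys(layers)) layers[key] = layers[key].trim();
  return layers;
}
```

With delimiters like these, the tabbed UI can route each section to the right audience as soon as its header arrives in the stream, rather than waiting for the full response.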
Accomplishments that we're proud of
We turned raw brain region activation data into plain-language insights that a performance marketer can act on in seconds. The three-layer report system means one analysis serves three different roles without anyone having to filter through information meant for someone else.
What we learned
Brain data is only as useful as the interface around it. TRIBE V2 is genuinely powerful infrastructure, but its value was locked behind numerical arrays.
What's next for Grey Matter
Real-time processing as TRIBE V2 inference speeds improve, A/B comparison mode to stack two ad variants side by side at the neural level, and benchmark libraries so teams can compare their content against category baselines. Longer term, we want to move upstream — using brain predictions to guide creative decisions before production, not just evaluate them after.
Built With
- claudeagentsdk
- claudeapi
- next.js
- sonnet4
- tailwind.css