Inspiration
Mental health apps tell you what to do — breathe, journal, meditate. But they have no idea what's actually happening in your brain when you're struggling. We were inspired by Meta's TRIBE v2 research, which demonstrated that a transformer model trained on fMRI data can predict which brain regions activate in response to natural language — in real time.
We asked: what if a therapist AI could see your brain while you talked to it?
What It Does
MindFeed is a brain-aware AI mental health companion. When you type how you're feeling, it:
- Simulates TRIBE v2 — maps your words to predicted fMRI activations across 16 key brain regions (amygdala, prefrontal cortex, hippocampus, etc.); the mapping is sketched after this list
- Renders a live 3D brain — a real anatomical GLB model with an fMRI-style heatmap showing which regions are "lighting up"
- Responds therapeutically — Groq Cloud (Llama 3.3 70B) generates responses calibrated to your neural state, not just your words. High amygdala + low PFC? It grounds you. High precuneus? It engages your narrative.
- Protects your regulated state — after each session, it generates a curated "brain-safe" Bluesky feed designed to keep you from sliding back into dysregulation
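Under the hood, that first step is not a real fMRI model; it is a lexicon-and-weights approximation. A minimal sketch of the idea in JavaScript, with illustrative lexicon entries, weights, and region names rather than our production values:

```javascript
// Sketch of the simulator's core idea: an emotion lexicon scores tokens,
// and a region weight matrix projects those scores onto brain regions.
// All entries and weights below are illustrative placeholders.
const EMOTION_LEXICON = {
  panic:    { fear: 0.9, arousal: 0.8 },
  alone:    { sadness: 0.7 },
  grateful: { joy: 0.8 },
};

const REGION_WEIGHTS = {
  fear:    { amygdala: 0.95, prefrontalCortex: -0.4, hippocampus: 0.3 },
  sadness: { amygdala: 0.5, precuneus: 0.6, hippocampus: 0.4 },
  joy:     { prefrontalCortex: 0.6, nucleusAccumbens: 0.8 },
  arousal: { insula: 0.7, brainstem: 0.4 },
};

function predictActivations(text) {
  const activations = {};
  for (const token of text.toLowerCase().split(/\W+/)) {
    for (const [emotion, score] of Object.entries(EMOTION_LEXICON[token] ?? {})) {
      for (const [region, weight] of Object.entries(REGION_WEIGHTS[emotion])) {
        activations[region] = (activations[region] ?? 0) + score * weight;
      }
    }
  }
  // Squash accumulated scores into [0, 1] for the heatmap.
  for (const region of Object.keys(activations)) {
    activations[region] = Math.max(0, Math.tanh(activations[region]));
  }
  return activations;
}

// e.g. predictActivations("I panic when I'm alone")
// => high amygdala, suppressed prefrontal cortex, moderate hippocampus
```

In this sketch the tanh squash just keeps scores in [0, 1]; the real lexicon covers far more words and all 16 regions.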
How We Built It
- React + Vite for the frontend
- React Three Fiber + Three.js for the 3D brain visualization with real-time vertex-colored activation heatmaps
- Custom TRIBE v2 Simulator built from scratch in JavaScript — an emotion lexicon plus region weight matrix that approximates the language-to-region mapping described in Meta's TRIBE v2 paper (sketched above under What It Does)
- Groq Cloud (Llama 3.3-70b-versatile) for streaming therapeutic AI responses with neural-state-aware system prompts (prompt assembly sketched after this list)
- Real anatomical brain GLB model with auto-normalized bounding-box scaling (sketched below), organic lighting, and activation hotspot overlays
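The neural-state-aware responses are ordinary prompt assembly in front of Groq's OpenAI-compatible chat API. A minimal sketch, assuming the groq-sdk npm package; the thresholds, prompt wording, and helper names are illustrative, not our exact prompts:

```javascript
import Groq from 'groq-sdk';

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

// Turn predicted activations into concrete behavioral instructions,
// not just a description of the brain state. Thresholds are illustrative.
function buildSystemPrompt(activations) {
  const rules = ['You are a warm, evidence-informed mental health companion.'];
  if (activations.amygdala > 0.7 && activations.prefrontalCortex < 0.3) {
    rules.push('The user is likely dysregulated: keep replies short, lead with a grounding exercise, avoid analysis.');
  }
  if (activations.precuneus > 0.6) {
    rules.push('The user is in self-narrative mode: reflect their story back and ask one open question.');
  }
  return rules.join(' ');
}

async function respond(userMessage, activations, onToken) {
  const stream = await groq.chat.completions.create({
    model: 'llama-3.3-70b-versatile',
    stream: true,
    messages: [
      { role: 'system', content: buildSystemPrompt(activations) },
      { role: 'user', content: userMessage },
    ],
  });
  // Stream tokens to the UI as they arrive.
  for await (const chunk of stream) {
    onToken(chunk.choices[0]?.delta?.content ?? '');
  }
}
```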
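The auto-normalized bounding-box scaling is standard Three.js bookkeeping: measure the loaded model, scale its longest axis to a known size, and recenter it. A sketch with a placeholder file name and target size:

```javascript
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const TARGET_SIZE = 2; // desired extent of the longest axis, in world units

// 'brain.glb' is a placeholder path for our actual asset.
new GLTFLoader().load('brain.glb', (gltf) => {
  const brain = gltf.scene;

  // Measure the model as exported, whatever units it came in.
  const box = new THREE.Box3().setFromObject(brain);
  const size = box.getSize(new THREE.Vector3());
  const center = box.getCenter(new THREE.Vector3());

  // Scale the longest axis to TARGET_SIZE and recenter at the origin.
  const scale = TARGET_SIZE / Math.max(size.x, size.y, size.z);
  brain.scale.setScalar(scale);
  brain.position.sub(center.multiplyScalar(scale));

  scene.add(brain); // `scene` is the surrounding Three.js scene
});
```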
Challenges We Faced
- Brain geometry — building a convincing procedural brain (before we sourced a real model) required multi-octave FBM noise, proper ellipsoid proportions, hemispheric fissures, temporal lobe bulges, and separate cerebellum/brainstem meshes (the displacement trick is sketched after this list)
- Neural mapping — designing a TRIBE v2 lexicon that meaningfully maps language to brain regions (not just sentiment) required deep reading of the neuroscience literature
- Therapeutic calibration — engineering system prompts that actually change behavior based on neural state (rather than just mentioning the brain state) took significant iteration
- Real-time vertex coloring — mapping activations onto 160×120 sphere vertices with distance-falloff weighting, per frame, without tanking performance (see the second sketch below)
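For the brain-geometry challenge, the procedural fallback boiled down to FBM vertex displacement over an ellipsoid. A simplified sketch, assuming the simplex-noise npm package; displacement amounts and proportions are illustrative:

```javascript
import * as THREE from 'three';
import { createNoise3D } from 'simplex-noise'; // assumed noise source

const noise3D = createNoise3D();

// Multi-octave fractal Brownian motion: sum noise at doubling frequencies
// and halving amplitudes to get wrinkle detail at several scales.
function fbm(x, y, z, octaves = 4) {
  let sum = 0, amp = 0.5, freq = 1;
  for (let i = 0; i < octaves; i++) {
    sum += amp * noise3D(x * freq, y * freq, z * freq);
    amp *= 0.5;
    freq *= 2;
  }
  return sum; // roughly in [-1, 1]
}

function makeBrainGeometry() {
  const geo = new THREE.SphereGeometry(1, 160, 120);
  geo.scale(1.2, 1.0, 1.45); // ellipsoid proportions: longer front-to-back
  const pos = geo.attributes.position;
  const v = new THREE.Vector3();
  for (let i = 0; i < pos.count; i++) {
    v.fromBufferAttribute(pos, i);
    const n = v.clone().normalize();
    // Gyri/sulci: radial displacement driven by FBM.
    let r = 1 + 0.06 * fbm(n.x * 6, n.y * 6, n.z * 6);
    // Hemispheric fissure: pull vertices near the sagittal plane inward.
    r -= 0.08 * Math.exp(-(v.x * v.x) / 0.01);
    pos.setXYZ(i, v.x * r, v.y * r, v.z * r);
  }
  geo.computeVertexNormals();
  return geo;
}
```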
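And the real-time vertex coloring is a per-frame color-attribute pass in React Three Fiber. A sketch with illustrative hotspot coordinates (the real 16-region table is omitted); the mesh material needs vertexColors: true:

```javascript
import * as THREE from 'three';
import { useFrame } from '@react-three/fiber';

// Hotspot centers in the model's local space. These coordinates are
// illustrative placeholders, not real anatomical positions.
const REGION_CENTERS = {
  amygdala: new THREE.Vector3(0.35, -0.2, 0.15),
  prefrontalCortex: new THREE.Vector3(0, 0.25, 0.95),
  // ...the other 14 regions
};

const BASE = new THREE.Color('#b9a7a0'); // resting cortex tint
const HOT = new THREE.Color('#ff2d00');  // peak-activation tint

// Recolors every vertex each frame: accumulate heat from each region with
// inverse-square distance falloff, then blend BASE toward HOT by that heat.
function useActivationHeatmap(geometry, activationsRef) {
  useFrame(() => {
    const pos = geometry.attributes.position;
    let colors = geometry.getAttribute('color');
    if (!colors) {
      colors = new THREE.BufferAttribute(new Float32Array(pos.count * 3), 3);
      geometry.setAttribute('color', colors);
    }
    const v = new THREE.Vector3();
    const c = new THREE.Color();
    for (let i = 0; i < pos.count; i++) {
      v.fromBufferAttribute(pos, i);
      let heat = 0;
      for (const [region, center] of Object.entries(REGION_CENTERS)) {
        const a = activationsRef.current[region] ?? 0;
        heat += a / (1 + 8 * v.distanceToSquared(center)); // distance falloff
      }
      c.copy(BASE).lerp(HOT, Math.min(heat, 1));
      colors.setXYZ(i, c.r, c.g, c.b);
    }
    colors.needsUpdate = true;
  });
}
```

Keeping activations in a ref means new predictions feed the frame loop without re-rendering the React tree, which helps keep the per-frame pass cheap.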
What We Learned
That neuroscience + AI + 3D visualization is an incredibly powerful combination for mental health. The brain heatmap makes abstract emotional states visible in a way that feels both scientific and deeply human.