Inspiration
We were five devs at a hackathon, each using a different AI assistant (Gemini, Claude, GPT). Within the first hour, two of us had implemented the same Supabase Edge Function with different approaches, and the conflict didn't surface until git merge. We realized: if AI assistants are going to write most of the code, they need to see each other. That's when we stopped building our original idea and started building the tool we actually needed.

What it does

HiveMind wraps any AI CLI (Gemini, Claude, etc.) in a transparent PTY layer that captures every prompt and response without changing your workflow. It detects semantic collisions before they reach git, using 768-dimensional embeddings and cosine similarity, and alerts both developers in real time. A central dashboard (the Watchtower) shows who's working on what, a live activity feed, and a synced Trello board that the AIs read, update, and organize autonomously. Plugins for IntelliJ and VS Code surface collision warnings inline in your editor. The AIs coordinate through the shared board, with no human intervention needed.
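The collision check described above can be sketched in a few lines. This is an illustrative sketch, not HiveMind's actual code: the function names and the 0.85 threshold are assumptions, but the core operation (cosine similarity between two embedding vectors) is standard.

```typescript
// Hypothetical sketch of collision scoring: compare two prompt embeddings
// (e.g. 768-dimensional vectors) by cosine similarity.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Two prompts whose embeddings score above a (hypothetical) threshold are
// treated as a potential semantic collision.
function isCollision(a: number[], b: number[], threshold = 0.85): boolean {
  return cosineSimilarity(a, b) >= threshold;
}
```

The threshold is the interesting design knob: too low and every pair of related tasks alerts, too high and genuinely duplicated work slips through.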

How we built it

  • CLI wrapper: Node.js + @lydell/node-pty for real PTY capture, TypeScript, MCP SDK for exposing 7 tools to Gemini
  • Backend: Supabase (PostgreSQL + pgvector + Realtime + Edge Functions in Deno). 17 Edge Functions, 14 SQL migrations
  • Watchtower: React 19 + Vite 8 + Tailwind CSS 4 PWA, deployed on GitHub Pages
  • IDE plugins: Kotlin (IntelliJ Platform SDK) + TypeScript (VS Code Extension API), both with embedded Watchtower and
    inline collision decorations
  • Semantic search: Gemini text-embedding-004 embeddings (768 dimensions) stored in pgvector with HNSW indexing
  • Coordination: Claude Code + Gemini CLI as our own dev tools, dogfooding HiveMind to build HiveMind
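To make the pgvector piece concrete: pgvector accepts vectors as text literals of the form `[0.1,0.2,...]`, and supports HNSW indexes via `USING hnsw`. Below is a hedged sketch of serializing an embedding for insertion; the helper name and the table/column names in the comment are assumptions, not HiveMind's actual schema.

```typescript
// Serialize an embedding into pgvector's text literal format, e.g. "[1,2,3]".
// A row could then be inserted with a parameterized query, and queried with a
// (hypothetical) HNSW index such as:
//   CREATE INDEX ON intents USING hnsw (embedding vector_cosine_ops);
function toPgvectorLiteral(embedding: number[], expectedDim = 768): string {
  if (embedding.length !== expectedDim) {
    throw new Error(`expected ${expectedDim} dimensions, got ${embedding.length}`);
  }
  return `[${embedding.join(",")}]`;
}
```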

Challenges we ran into

  • PTY capture on all platforms: The classic node-pty doesn't compile on Node 25. We switched to @lydell/node-pty and built Windows support with cmd.exe /c wrappers for .cmd scripts.
  • Supabase Realtime + RLS: Row Level Security policies silently blocked Realtime events. We spent hours debugging why the
    Watchtower showed "connected" but received no data — the WebSocket was authenticated but the policies filtered everything
    out.
  • Credential exposure: We accidentally committed API keys to the repo. We had to rewrite the entire git history with
    git-filter-repo across 172 commits, rebuild all release artifacts, and force-push — all at 3 AM.
  • 5 devs, 5 AI assistants, 1 repo: Exactly the problem we were solving became our biggest challenge. We had real collisions while building the collision detector.
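The Windows workaround mentioned above can be sketched as a small dispatch function: `.cmd` and `.bat` scripts can't be spawned directly as a PTY target, so they get wrapped in `cmd.exe /c`. The helper name and shape are illustrative, not HiveMind's actual code.

```typescript
// Decide what to actually spawn in the PTY: on Windows, wrap .cmd/.bat
// scripts in `cmd.exe /c`; everywhere else, spawn the command directly.
function resolveSpawnTarget(
  command: string,
  args: string[],
  platform: string = process.platform,
): { file: string; args: string[] } {
  const isCmdScript = /\.(cmd|bat)$/i.test(command);
  if (platform === "win32" && isCmdScript) {
    return { file: "cmd.exe", args: ["/c", command, ...args] };
  }
  return { file: command, args };
}
```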

Accomplishments that we're proud of

  • Zero workflow change: hivemind run wraps your existing CLI. Colors, autocomplete, shortcuts — everything works. The
    developer doesn't know HiveMind is there.
  • AIs that self-organize: Through the MCP tools, Gemini reads Trello cards, moves them between columns, posts comments, and blocks itself when another agent is touching the same file. No human told it to.
  • Cross-platform from day one: CLI, IntelliJ plugin, VS Code extension, and PWA — all working on macOS, Windows, and Linux.
  • We dogfooded it: The last 12 hours of the hackathon, we used HiveMind to build HiveMind. It caught 3 real collisions.
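The self-blocking behavior described above boils down to a file-overlap check before an agent starts editing. This is a minimal sketch under assumed data shapes; the interface and function names are hypothetical.

```typescript
// A claim is a set of files an active agent is currently working on.
interface AgentClaim {
  agent: string;
  files: string[];
}

// Before editing, check whether any active agent has claimed one of our
// target files; if so, return who and which files overlap.
function findBlockingAgent(
  myFiles: string[],
  activeClaims: AgentClaim[],
): { agent: string; overlap: string[] } | null {
  const mine = new Set(myFiles);
  for (const claim of activeClaims) {
    const overlap = claim.files.filter((f) => mine.has(f));
    if (overlap.length > 0) return { agent: claim.agent, overlap };
  }
  return null;
}
```

In HiveMind's setup this check would run behind an MCP tool, so the AI itself decides to wait rather than being interrupted by a human.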

What we learned

  • Realtime is harder than it looks: WebSocket connections, JWT expiration, RLS policies, REPLICA IDENTITY — each one is a
    silent failure point. Polling fallbacks are essential.
  • AI coordination is a new problem space: There's no established pattern for making multiple AI agents aware of each other. We had to invent the protocol.
  • The methodology became the product: We started with a process (shared Trello board, Slack alerts, manual coordination) and realized the process itself was the product. The Watchtower was born from our own dashboard. The intents came from our own workflow.
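The polling-fallback lesson can be sketched as a staleness watchdog: if the realtime channel has gone quiet past a deadline, start polling. Everything here is an assumed shape for illustration; a real version would wrap Supabase Realtime channels.

```typescript
// Has the realtime channel gone silent long enough to warrant a fallback?
function isStale(lastEventAtMs: number, nowMs: number, staleAfterMs = 30_000): boolean {
  return nowMs - lastEventAtMs >= staleAfterMs;
}

// Start interval polling as a fallback; returns a function that stops it.
function startPollingFallback<T>(
  fetchLatest: () => Promise<T>,
  onData: (data: T) => void,
  intervalMs = 5000,
): () => void {
  const timer = setInterval(async () => {
    try {
      onData(await fetchLatest());
    } catch {
      // Swallow errors; the next tick retries.
    }
  }, intervalMs);
  return () => clearInterval(timer);
}
```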

What's next for HiveMind

  • Semantic collision v2: Move from cosine similarity to LLM-powered conflict analysis that understands code intent, not
    just text similarity
  • Self-hosted mode: Full docker compose up with local Supabase, no cloud dependency
  • Agent marketplace: Let teams define custom MCP tools that all their AIs share — linters, deploy scripts, code review bots
  • Auto-resolution: Instead of just detecting collisions, let the AIs negotiate and merge their approaches autonomously
  • Support for every AI CLI: Cursor, Copilot, Windsurf, Aider — any tool that runs in a terminal
