Inspiration
I’ve always wanted a J.A.R.V.I.S.-like assistant, but my context today is fragmented across Gemini, ChatGPT, and Claude. This “context amnesia” forces constant repetition. I built Cortex Protocol to solve this—a universal memory layer that acts as a persistent, portable “SSD” for my digital life, while giving me full control over my data.
What It Does
Cortex Protocol is an open-source Model Context Protocol (MCP) Server that works as a shared brain for AI agents. It uses a “Biological Funnel” with three layers: Hot memory (24-hour recall), Warm memory (monthly summaries), and Cold memory (permanent Knowledge Graph). A memory-decay system ensures irrelevant data fades, and a 3D Glass Brain dashboard visualizes the AI’s memory.
How I Built It
The project was 98% vibecoded using Antigravity. Gemini 3 Pro (1M context) powers “The Dreamer” for nightly memory consolidation, while Gemini 3 Flash handles fast, cost-efficient real-time memorize/recall. The stack includes Python FastAPI on Google Cloud Run, Firebase (Auth and Firestore Vector Search), and Next.js on Vercel.
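Setting aside Firestore Vector Search and Gemini embeddings, the memorize/recall path reduces to nearest-neighbor search over embedding vectors. A stdlib-only sketch of that core loop (the `MemoryStore` class and its methods are hypothetical stand-ins, not the real API):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Toy stand-in for Firestore Vector Search: memorize stores
    (embedding, text) pairs; recall returns the top-k texts by similarity."""

    def __init__(self) -> None:
        self.items: list[tuple[list[float], str]] = []

    def memorize(self, embedding: list[float], text: str) -> None:
        self.items.append((embedding, text))

    def recall(self, query_embedding: list[float], k: int = 3) -> list[str]:
        ranked = sorted(
            self.items,
            key=lambda item: cosine(query_embedding, item[0]),
            reverse=True,
        )
        return [text for _, text in ranked[:k]]
```

In the real system the embeddings would come from a Gemini embedding model and the search would run server-side in Firestore; this sketch only shows the retrieval logic the MCP tools wrap.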
Challenges & Accomplishments
Balancing a full-time job with the hackathon was intense. Improving retrieval for vague queries like “What else?” required adding a query-rewriting step before retrieval. Major wins include the “Magic Handoff” to Cursor IDE, independent Dreamer services, and the Glass Brain.
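The query-rewriting idea can be sketched in a few lines: when a query is too vague to embed on its own, fold in recent conversation turns so the retrieval query carries real content. This is a heuristic stand-in; the actual system would presumably ask Gemini 3 Flash to do the rewrite, and `rewrite_query` and its marker list are hypothetical:

```python
VAGUE_MARKERS = ("what else", "and", "more", "that one", "go on")

def rewrite_query(query: str, history: list[str]) -> str:
    """Rewrite a vague follow-up query using recent conversation turns.

    If the query matches a known vague pattern or is very short, prepend
    the last two turns of context; otherwise pass it through unchanged.
    """
    normalized = query.lower().strip("?! .")
    if normalized in VAGUE_MARKERS or len(query.split()) < 3:
        context = " ".join(history[-2:])
        return f"{context} (follow-up: {query})"
    return query
```

The point is that the embedding of “What else?” alone is nearly useless for vector search, while the rewritten query inherits the semantics of the surrounding conversation.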
Learnings & What’s Next
I learned memory needs curation, privacy needs granular control, and MCP state management is critical. Next, I aim to scale Cortex to millions through production polish, deeper integrations (including ChatGPT Apps), local-first on-device PII redaction, encrypted user-owned storage, and a smarter Biological Funnel.
Built With
- ai-studio
- docker
- firebase
- gemini
- gemini-3
- google-cloud
- google-cloud-run
- nextjs
- python
- typescript
- vercel

