Inspiration
We were tired of hallucinated citations and black-box LLM answers in research. We wanted to build something that actually proves where every piece of information comes from in real time.
What it does
Cortex lets users upload research papers and then interact with them. Every response the AI gives is citation-verified; users can instantly see the exact source paper and page a statement came from.
How we built it
We used a Model Context Protocol (MCP) server as the backbone, connecting our Gemini LLM to a document database, an embedding model, and an agentic pipeline. Everything runs through a custom retrieval-augmented generation (RAG) system optimized for chunk-level traceability.
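The core idea is that every retrieved chunk carries its citation metadata (source paper and page) all the way to the answer. A minimal sketch of that pattern, with toy lexical scoring standing in for the embedding model and all names hypothetical rather than Cortex's actual API:

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    # Each chunk keeps its provenance so answers stay traceable.
    text: str
    paper: str
    page: int

def score(query: str, chunk: Chunk) -> int:
    # Toy word-overlap score; a real system would use embedding similarity.
    q = set(query.lower().split())
    c = set(chunk.text.lower().split())
    return len(q & c)

def retrieve(query: str, chunks: list[Chunk], k: int = 2) -> list[Chunk]:
    # Return the top-k chunks; each result carries its citation metadata,
    # which is what lets the UI show the exact paper and page.
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

corpus = [
    Chunk("Transformers use self-attention over token sequences.", "Vaswani2017", 3),
    Chunk("Mitochondria are the powerhouse of the cell.", "Bio101", 12),
]
top = retrieve("how does self-attention work in transformers", corpus, k=1)
print(top[0].paper, top[0].page)  # prints "Vaswani2017 3"
```

The design choice is simply that citations are never reconstructed after the fact: they travel with the chunk from retrieval to response.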
Challenges we ran into
Making chunk-level verification actually work, and integrating every component cleanly in under 36 hours while keeping hallucination to a minimum.
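The hard part of chunk-level verification is deciding whether a generated statement is actually supported by the chunk it cites. One simple way to sketch that check, using content-word overlap as a stand-in for a real entailment or embedding-based test (the threshold and function names here are illustrative assumptions, not the actual pipeline):

```python
def is_grounded(claim: str, source_text: str, threshold: float = 0.5) -> bool:
    # Illustrative grounding check: the fraction of the claim's words
    # that appear in the cited chunk must clear a threshold.
    claim_words = {w.lower().strip(".,") for w in claim.split()}
    source_words = {w.lower().strip(".,") for w in source_text.split()}
    if not claim_words:
        return False
    return len(claim_words & source_words) / len(claim_words) >= threshold

source = "The model reduces training time by 40% on the benchmark suite."
print(is_grounded("Training time drops by 40% on the benchmark", source))   # True
print(is_grounded("The model eliminates hallucination entirely", source))   # False
```

A statement that fails this kind of check would be flagged rather than shown with a citation, which is the behavior that keeps the pipeline explainable.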
Accomplishments that we're proud of
We built an agentic MCP server with real citation verification in a single weekend. We pulled off an explainable pipeline where every output is grounded in actual text.
What we learned
Agentic systems need well-defined structure and communication protocols to stay reliable. Real-time citation verification is the missing piece for trustworthy LLMs.
What's next for CORTEX
Expand citation verification down to the sentence level with confidence scores. Add consensus maps to visualize agreement across multiple studies. Deploy Cortex as a tool for scientists to interactively verify AI reasoning.