Inspiration

While learning how MCP servers and clients worked, we realized AI models don't even know what MCP stands for. Docs change all the time, and AI models can't be trained on the "newest" thing, but they can be given context on the latest.

What it does

Dontext is an MCP (Model Context Protocol) server that lets LLMs pull context from a website.
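Concretely, an MCP client invokes one of the server's tools over JSON-RPC. The sketch below shows the general shape of such a call; the tool name `get_docs_context` and its arguments are illustrative placeholders, not Dontext's actual interface.

```typescript
// Illustrative shape of an MCP "tools/call" request that a client
// (e.g. a chat app) might send to a server like Dontext.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_docs_context", // hypothetical tool name
    arguments: { url: "https://nextjs.org/docs" },
  },
};
```

The server answers with the tool's result, which in this case would be the scraped page content for the model to read.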

How we built it

We built it using TypeScript and the firecrawl-simple fork. The MCP server exposes tools that call the firecrawl-simple API, and we tuned the API requests to return clean markdown context for the LLM.
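A minimal sketch of that request tuning, assuming firecrawl-simple exposes a Firecrawl-style `/v1/scrape` endpoint; the exact endpoint and parameter names may differ in the fork.

```typescript
// Hypothetical request body for a Firecrawl-style scrape endpoint.
interface ScrapeRequest {
  url: string;
  formats: string[];        // ask for markdown only, not raw HTML
  onlyMainContent: boolean; // drop navbars, footers, sidebars
  excludeTags: string[];    // strip elements that are noise for an LLM
}

function buildScrapeRequest(url: string): ScrapeRequest {
  return {
    url,
    formats: ["markdown"],
    onlyMainContent: true,
    excludeTags: ["nav", "footer", "script", "style"],
  };
}

// An MCP tool handler could then POST this body and hand the returned
// markdown straight to the model (response shape is assumed here).
async function scrapeDocs(apiBase: string, target: string): Promise<string> {
  const res = await fetch(`${apiBase}/v1/scrape`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildScrapeRequest(target)),
  });
  const json = await res.json();
  return json.data?.markdown ?? "";
}
```

Keeping the request builder separate from the HTTP call made it easy to iterate on the settings until the markdown came back clean.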

Challenges we ran into

We had never built an MCP server or used Firecrawl before, so getting all of the configuration set up was a challenge: many of the settings were niche and hard to figure out.

Accomplishments that we're proud of

We tested Dontext on the Next.js, Svelte, Tailwind, and other documentation sites. It returned clean markdown, and with that context LLMs were able to correct their mistakes and write accurate code.

What we learned

Our first idea didn't work out. Our goal was to make a multi-platform AI chat client with MCP support, but after a couple of hours we realized it wouldn't be possible by 6, so we pivoted to our second idea. When you are crunched for time, execute. When you realize there is a moat you cannot cross, build across it.

What's next for Dontext

Building more tools that let LLMs pull context from parts of a website and from other sources, like your personal calendar. Our goal is to provide LLMs with context that helps you.
