Inspiration
We were inspired by a simple but painful truth: developers hate writing documentation. It's often the last thing anyone wants to do, and even when it gets done, it quickly becomes outdated. Static documentation rarely keeps up with evolving codebases, making it hard for teams to maintain internal knowledge. This becomes a real problem when developers leave and no one understands how the system works. Onboarding new team members becomes time-consuming and frustrating. Even worse, when software products are shipped with great features but poor documentation, it leads to low adoption and high support overhead. We built llmao.ai to solve this problem once and for all: by automating documentation in a way that's both accurate and useful.
What it does
Companies can internally host an instance of llmao.ai to automate their documentation workflows.
- Implementation Documentation (internal to a company): The platform semantically analyzes a codebase (such as a GitHub-hosted repository), identifies relationships between modules, and captures their functionality as well as the overall system architecture.
- API/SDK References: It extracts the purpose, inputs, and outputs of public APIs (e.g., those available through SDKs) from the repository; a sketch of this extraction step appears below.
Developers have the option to review the generated documentation before it is published to internal knowledge bases or public platforms. Moreover, llmao.ai allows developers to interact with the documentation through AI assistants, enabling them to get specific answers to queries without reading the entire document.
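To make the API-reference step concrete, here is a minimal sketch of what the extraction could look like for a Python codebase. This is illustrative only, not llmao.ai's actual implementation: the real pipeline is LLM-driven, while this sketch uses the standard-library ast module to pull out the raw material (names, parameters, docstrings) that an agent would then summarize.

```python
# Illustrative sketch: collect the public API surface of a Python repo.
# The function name and output shape are assumptions for this example.
import ast
from pathlib import Path

def extract_public_api(repo_root: str) -> list[dict]:
    """Collect name, parameters, and docstring of every public function."""
    entries = []
    for path in Path(repo_root).rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            # Treat underscore-prefixed functions as private by convention.
            if isinstance(node, ast.FunctionDef) and not node.name.startswith("_"):
                entries.append({
                    "file": str(path),
                    "name": node.name,
                    "params": [arg.arg for arg in node.args.args],
                    "doc": ast.get_docstring(node) or "(undocumented)",
                })
    return entries

if __name__ == "__main__":
    # "./my-repo" is a placeholder path for demonstration.
    for entry in extract_public_api("./my-repo"):
        print(f"{entry['name']}({', '.join(entry['params'])}) - {entry['doc'][:60]}")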
How we built it
Our application is powered by Letta Cloud, Google Gemini, v0 by Vercel, and the GitHub MCP Server.
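As a rough sketch of how one piece could be wired up, the snippet below uses the google-generativeai Python SDK to have Gemini draft documentation for a single module. The model name and prompt are assumptions for illustration; in llmao.ai the orchestration runs through Letta Cloud agents, and repository contents arrive via the GitHub MCP Server rather than a local read. The same call pattern, with the generated docs plus a user question in the prompt, is the shape of the chat experience described above.

```python
# Sketch of the documentation-drafting step. Model name, prompt, and
# overall shape are assumptions; agent orchestration (Letta Cloud) and
# repo access (GitHub MCP Server) are omitted for brevity.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro")  # assumed model choice

def draft_module_docs(module_source: str) -> str:
    """Ask Gemini to write implementation documentation for one module."""
    prompt = (
        "You are a technical writer. Document the following module: "
        "summarize its purpose, its public functions (inputs and outputs), "
        "and how it fits into the wider system.\n\n" + module_source
    )
    response = model.generate_content(prompt)
    return response.text
```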
Challenges we ran into
This was the first time any of us had worked with Letta Cloud; it was a challenging yet rewarding experience.
Accomplishments that we're proud of
We’re proud that llmao.ai can take any GitHub repository and generate clean, readable, and technically sound documentation, covering both internal implementation docs and public API references. The dual-documentation flow using specialized agents was an ambitious idea, and we made it work. We also built a real-time AI chat experience that answers technical questions with context-aware precision. Finally, we designed the system to be scalable and secure, with support for self-hosting within companies, making it a practical solution for real-world development teams.
What's next for llmao.ai
We’re already working on several next steps. These include automatically triggering doc updates when the main branch is updated, and supporting multi-repo systems that span several services. We also plan to generate release notes directly from pull requests, making changelogs easier to manage. We’re building integrations with popular platforms like Confluence, Notion, and Slack, so teams can push docs directly into their workflows.
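For the push-triggered doc updates, the plan boils down to a GitHub webhook that kicks off a rebuild whenever main changes. Here is a minimal sketch using Flask; the route path and the rebuild_docs() helper are hypothetical, and a real deployment should verify the webhook signature before trusting the payload.

```python
# Sketch of a "rebuild docs on push to main" trigger: a minimal GitHub
# webhook receiver. rebuild_docs() is a hypothetical hook into the
# documentation pipeline; signature verification (X-Hub-Signature-256)
# is omitted here but required in production.
from flask import Flask, request

app = Flask(__name__)

def rebuild_docs(repo_full_name: str) -> None:
    # Hypothetical entry point into the doc-generation pipeline.
    print(f"re-generating docs for {repo_full_name}")

@app.route("/webhooks/github", methods=["POST"])
def on_push():
    event = request.get_json(silent=True) or {}
    # GitHub push payloads carry the updated ref, e.g. "refs/heads/main".
    if event.get("ref") == "refs/heads/main":
        rebuild_docs(event.get("repository", {}).get("full_name", "unknown"))
    return "", 204

if __name__ == "__main__":
    app.run(port=8000)
```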