Inspiration
The first two hours of a maintainer’s day are entirely wasted. You wake up, open your laptop, and drown in context-switching: 20 unread merge requests, 3 failed pipelines from the night shift, stale Jira tickets, and chaotic Slack threads. Before you write a single line of code, your mental energy is depleted.
Furthermore, in the rush to ship features, we noticed a silent killer: Infrastructure Bloat. Developers blindly copy-paste Terraform or Docker configs, over-provisioning cloud resources. This skyrockets both cloud bills and carbon emissions, yet no one notices until the end-of-month audit.
We didn’t want another chatbot to ask questions to. We wanted a teammate. We built Nobi: an Autonomous AI Tech Lead that clears the noise, takes action on broken pipelines, and silently enforces cloud sustainability.
What it does
Nobi operates completely in the background via GitLab Triggers and Headless CLI, serving three primary roles:
- The Morning Briefing (Knowing Everything): Every morning at 8:00 AM, Nobi analyzes your entire repository's activity over the last 24 hours. It synthesizes merged MRs, highlights blocked issues, and summarizes team velocity into a clean, mobile-friendly GitLab Issue. You know exactly where your project stands in 60 seconds.
- The Reactive SRE (Doing Things): When a CI/CD pipeline breaks, Nobi doesn't just send an alert. Using GitLab Triggers, Nobi wakes up, reads the raw job logs, isolates the bug, writes the corrected code, and autonomously opens a Fix Merge Request. You just review it and tap "Merge."
- The Proactive GreenOps Architect: When an MR modifies infrastructure (like Terraform), Nobi connects to Google Cloud. It calculates the carbon and cost impact of the proposed changes, and if it spots waste, Nobi autonomously commits a greener alternative (e.g., suggesting energy-efficient ARM `t2a` instances over legacy `e2` instances).
How we built it
We architected Nobi natively on the GitLab Duo Agent Platform, heavily utilizing Anthropic's Claude 3.5/4.5 Sonnet for deep reasoning.
- Layered Agent Flow (YAML): To ensure enterprise-grade security and prevent prompt injection, we designed a custom `.gitlab/duo/flows/nobi.yaml` using a multi-agent routing system. We split Nobi into three distinct personas: an Investigator Agent (read-only), a Green Architect (MCP data fetcher), and an Executive Agent (write-only).
- Google Cloud MCP Server: To achieve the GreenOps functionality, we built a custom Node.js Model Context Protocol (MCP) server. This server securely connects Nobi to the Google Cloud Recommender and Carbon Footprint APIs, allowing the agent to fetch real-time infrastructure data.
- Event-Driven Triggers & Headless CLI: We utilized GitLab's native Automate > Triggers to wake Nobi up on pipeline failures or MR assignments. For the Daily Briefing, we utilized a scheduled CI/CD pipeline running the new GitLab Duo CLI (`duo run`) in Headless Mode to execute complex autonomous tasks without human intervention.
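As a sketch of the scheduling side, a Daily Briefing pipeline might look like the following. Note that the job name, image, and the exact `duo run` flags are illustrative assumptions rather than our verbatim configuration; only the `rules`/`$CI_PIPELINE_SOURCE` keys are standard GitLab CI syntax:

```yaml
# .gitlab-ci.yml (sketch) -- runs Nobi's Daily Briefing from a pipeline schedule.
# Assumptions: the image ships the Duo CLI, and `duo run` accepts these flags.
daily_briefing:
  image: node:20                                # hypothetical CLI-capable image
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'   # only fire from the 8:00 AM schedule
  script:
    # Headless mode: no interactive session; the agent flow drives everything.
    - duo run --flow .gitlab/duo/flows/nobi.yaml --headless
```

The same flow file is reused by the event triggers, so scheduled and reactive runs share one agent definition.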
Challenges we ran into
- Avoiding the "Lethal Trifecta" of Prompt Injection: Giving an AI write access to a repository is dangerous, and we initially struggled to balance autonomy with security. We solved this with the Layered Flow Architecture: the agent that reads potentially malicious MR code is architecturally isolated from the agent that has permission to execute the `create_merge_request` tool.
- Secure Cloud Integration: We wanted to query GCP for carbon metrics, but hardcoding Service Account keys in our MCP server felt like a security anti-pattern. We overcame this by implementing GitLab Workload Identity Federation (WLIF), allowing Nobi to authenticate to Google Cloud entirely passwordlessly via native OIDC `id_tokens`.
- Managing LLM Context Limits: Parsing massive CI/CD failure logs alongside large MR diffs pushed the model's context limits. We had to refine our agents' system prompts to aggressively filter out noise before passing context to the Executive Agent.
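The WLIF wiring can be sketched in CI terms. GitLab's `id_tokens` keyword below is real CI syntax; the pool/provider paths, job name, and file names are placeholders standing in for our actual configuration:

```yaml
# Sketch: passwordless GCP auth from a GitLab job via Workload Identity Federation.
# `id_tokens` is native GitLab CI; the audience and provider paths are placeholders.
query_carbon_metrics:
  id_tokens:
    GCP_ID_TOKEN:
      aud: https://iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/providers/PROVIDER_ID
  script:
    # Exchange the GitLab-issued OIDC token for GCP credentials -- no stored keys.
    - echo "$GCP_ID_TOKEN" > .ci_token
    - gcloud iam workload-identity-pools create-cred-config
        "projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/providers/PROVIDER_ID"
        --credential-source-file=.ci_token --output-file=.gcp_cred.json
    - gcloud auth login --cred-file=.gcp_cred.json
```

Because the token is minted per job and scoped to one audience, a leaked pipeline log never exposes a reusable credential.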
Accomplishments that we're proud of
- Moving beyond Chat: We successfully built an agent that takes action. Seeing Nobi autonomously read a failed pipeline log and generate a perfect Fix Merge Request without human prompting was a massive "Aha!" moment.
- The Custom GCP MCP Server: We are incredibly proud of bridging GitLab Duo with Google Cloud via the Model Context Protocol. It turns an isolated LLM into a hyper-aware Cloud Architect.
- Delivering Real-World Value: We didn't just build a toy; we built a tool that directly reduces developer burnout by filtering noise and cuts environmental impact by optimizing cloud compute.
What we learned
- The true power of MCP: The Model Context Protocol is a game-changer. It shifts the paradigm from "prompt engineering" to "tool engineering," letting us extend what GitLab Duo can see and do far beyond its built-in context.
- Flow Orchestration: We learned how to write robust YAML flows that pass state between multiple specialized agents, and found that three narrow agents perform far better than one generalist agent.
- The Carbon Impact of Code: We learned just how drastically a single line of Terraform configuration can alter a company's carbon footprint, and how uniquely positioned CI/CD AI agents are to catch it.
What's next for Nobi
This is just the beginning for Nobi. Our immediate next steps include:
- ChatOps Integration: Connecting Nobi's Headless CLI to a Slack/Telegram bot, allowing maintainers to request ad-hoc briefings or trigger pipeline fixes directly from their mobile chat apps.
- Expanding MCP Tools: Adding integrations for Kubernetes cluster auto-scaling, allowing Nobi to dynamically scale down non-prod environments on weekends to save carbon.
- Auto-Resolving SAST: Wiring Nobi directly into GitLab's native Security scanners so it can autonomously patch vulnerabilities the moment they are detected.
Built With
- duo
- gcp
- gitlab

