Inspiration

The rise of vibe coding has fundamentally changed how developers interact with their codebases. AI-powered coding assistants have made it possible to build faster than ever, but they've also introduced a new problem: context collapse.

When multiple developers (or the same developer across sessions) use LLM-powered tools to modify a codebase, each interaction happens in isolation. The AI makes changes without understanding why previous decisions were made. Other team members inherit code they didn't write, with no record of the reasoning behind it. Documentation becomes stale the moment it's written, and prompting LLMs to maintain it feels like shouting into the void.

I wanted to build something that could watch, learn, and remember: a daemon that lives alongside your codebase, capturing the thinking behind every change, not just the change itself.

What it does

Chronicle is a living documentation system powered by Gemini 3 Pro that automatically generates, verifies, and self-corrects project documentation. It runs as a background daemon alongside your codebase, watching for changes and maintaining documentation that evolves with your code.

The Three-Part Architecture

1. Background Daemon

The daemon is the heart of Chronicle, powered by Gemini 3 Pro's advanced reasoning capabilities. It:

  • Watches your filesystem using Chokidar for real-time file change detection
  • Analyzes code with Gemini 3 Pro using the @google/genai SDK for intelligent understanding
  • Leverages Gemini's ThinkingLevel API (HIGH/LOW) to calibrate reasoning depth per task
  • Generates vector embeddings via Gemini's embedding models for semantic code search
  • Logs all AI reasoning including Gemini's thoughtSignature tokens to a central "Thinking Platform"

2. Command-Line Interface

Developers interact with Chronicle through an intuitive CLI that manages the entire documentation lifecycle:

Service Management:

  • chronicle start launches both the background daemon and the web dashboard. The daemon begins watching for file changes while the UI becomes available at localhost:3001.
  • chronicle stop gracefully shuts down all Chronicle services.

Project Setup:

  • chronicle add [path] registers a directory for monitoring. Chronicle will track all code changes in this project and trigger Gemini analysis when files change.
  • chronicle remove stops monitoring the current directory.
  • chronicle init [path] performs initial Gemini-powered analysis of a project, generating comprehensive documentation from scratch.

Documentation Control:

  • chronicle pause [path] temporarily suspends automatic documentation updates, which is useful during large refactors.
  • chronicle resume [path] re-enables automatic updates.
  • chronicle diff [path] shows documentation version differences, letting you see how docs evolved over time.
  • chronicle versions [path] lists all documentation snapshots for a project.

Observability:

  • chronicle queue [action] manages the job queue: check status, clear pending jobs, or kill a stuck job.
  • chronicle log streams real-time activity from the daemon, showing Gemini's analysis as it happens.

3. Web Dashboard UI

A Next.js-powered dashboard running at localhost:3001 provides:

  • Project sidebar listing all monitored codebases
  • Documentation viewer with rendered markdown and mermaid diagrams
  • Jobs panel showing real-time Gemini processing status
  • Thinking Platform drawer visualizing Gemini's AI activity, reasoning depth, and thought signatures

The Prompt → Assume → Verify Pipeline

The core innovation that makes Chronicle "self-healing" is a three-step pipeline, powered entirely by Gemini 3 Pro:

┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│  Generate   │ ──▶ │   Extract   │ ──▶ │   Verify    │
│    Docs     │     │ Assumptions │     │  Against    │
│  (Gemini)   │     │   [CLAIM]   │     │    Code     │
└─────────────┘     └─────────────┘     └─────────────┘
                                               │
                                               ▼
                                    ┌─────────────────┐
                                    │ Self-Correcting │
                                    │  Documentation  │
                                    └─────────────────┘
  1. Generate: Gemini 3 Pro with ThinkingLevel.HIGH analyzes code and generates comprehensive documentation
  2. Assume: Uncertain claims are marked with [ASSUMPTION: claim] tags by Gemini
  3. Verify: Gemini cross-references each assumption against actual source code
  4. Correct: False assumptions trigger Gemini-powered rewrites with before/after logging
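The verify-and-correct step can be sketched as a pure function. This is an illustrative reconstruction, not Chronicle's actual code: the `verify` callback stands in for the real Gemini 3 Pro call, and the `Verdict` shape is hypothetical.

```typescript
// Hypothetical sketch of Chronicle's verify/correct step.
interface Verdict {
  claim: string;
  holds: boolean;
  correction?: string; // verified replacement text when the claim is false
}

const ASSUMPTION_TAG = /\[ASSUMPTION:\s*([^\]]+)\]/g;

function selfCorrect(
  doc: string,
  verify: (claim: string) => Verdict, // stand-in for the Gemini check
): { doc: string; actions: Verdict[] } {
  const actions: Verdict[] = [];
  let out = doc;
  for (const match of doc.matchAll(ASSUMPTION_TAG)) {
    const verdict = verify(match[1].trim());
    actions.push(verdict);
    // True claims are promoted to plain text; false claims are replaced
    // with the verified correction. Every verdict is logged as an action.
    out = out.replace(
      match[0],
      verdict.holds ? match[1].trim() : (verdict.correction ?? match[1].trim()),
    );
  }
  return { doc: out, actions };
}
```

The returned `actions` array is what feeds the before/after logging described above.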

Key Features

Gemini 3 Pro Integration

Chronicle uses the @google/genai SDK to connect directly to Gemini 3 Pro. Every documentation task, from analyzing a new codebase to verifying an assumption, flows through Gemini. The SDK handles authentication, request formatting, and response parsing, while Chronicle manages the orchestration layer: queuing tasks, tracking progress, and logging results.

ThinkingLevel Configuration

Gemini 3 Pro supports configurable reasoning depth through its ThinkingLevel API. Chronicle dynamically selects the appropriate level based on task complexity:

  • ThinkingLevel.HIGH is used for architectural analysis, initial project reconnaissance, and complex multi-file documentation where deep reasoning improves output quality.
  • ThinkingLevel.LOW is used for quick tasks like generating changelog entries or simple file summaries where speed matters more than exhaustive analysis.

This calibration ensures we're not wasting compute on trivial tasks while still getting thorough reasoning when it matters.
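In code, this calibration can be a simple mapping from task type to level before the Gemini request is built. The task taxonomy below is illustrative, not Chronicle's actual enum:

```typescript
// Hypothetical mapping from Chronicle task types to a Gemini 3 Pro
// thinking level; the level names follow the ThinkingLevel API.
type TaskType = 'recon' | 'architecture' | 'module-doc' | 'changelog' | 'summary';
type ThinkingLevel = 'HIGH' | 'LOW';

function thinkingLevelFor(task: TaskType): ThinkingLevel {
  switch (task) {
    // Deep, multi-file reasoning: worth the extra latency and compute.
    case 'recon':
    case 'architecture':
    case 'module-doc':
      return 'HIGH';
    // Quick, local tasks: speed matters more than exhaustive analysis.
    default:
      return 'LOW';
  }
}
```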

Thought Signature Tracking

When Gemini 3 Pro engages in extended reasoning, it returns a thoughtSignature token in the response. Chronicle captures these signatures and displays them in the Thinking Platform UI with a "✓ Extended reasoning used" indicator. This gives developers visibility into when Gemini was thinking deeply versus responding quickly, which is crucial for understanding documentation quality.

Intelligent File Watching

The daemon uses Chokidar to monitor filesystem changes in real-time. It's configured to ignore noise: node_modules, .git, build outputs, and other non-essential paths are filtered out. When a meaningful file changes (source code, configs, existing docs), Chronicle enqueues a job to re-analyze that portion of the codebase.
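The noise filtering described above boils down to a path predicate. This sketch uses a plausible ignore list; Chronicle's actual Chokidar configuration may filter more (or different) paths:

```typescript
// Illustrative version of the daemon's noise filter. A change is only
// enqueued for re-analysis if no path segment is on the ignore list.
const IGNORED = ['node_modules', '.git', 'dist', 'build', '.next'];

function isMeaningfulChange(filePath: string): boolean {
  const segments = filePath.split(/[\\/]/);
  return !segments.some((segment) => IGNORED.includes(segment));
}
```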

Thinking Platform

Every interaction with Gemini is logged to a SQLite database with:

  • Prompt summary: What Chronicle asked Gemini to do
  • Response preview: A snippet of what Gemini returned
  • Thinking indicator: Whether extended reasoning was engaged
  • Timestamps and duration: Performance tracking
  • Activity type: Categorized as analyze, generate, plan, embed, or action

The web UI renders this as a real-time activity feed, letting developers see exactly what the AI is doing at any moment.

Assumption Tracking

When Gemini generates documentation, it may not have complete information about the codebase. Rather than hallucinating, Chronicle instructs Gemini to explicitly mark uncertain claims with [ASSUMPTION: claim here] tags. These assumptions are extracted and stored in the database with references to the source document. A separate verification job later checks each assumption against actual code.
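Extraction itself is mechanical once the tag format is fixed. The record shape below is a hypothetical stand-in for Chronicle's database row:

```typescript
// Sketch of assumption extraction; the AssumptionRecord fields are
// illustrative, not Chronicle's actual schema.
interface AssumptionRecord {
  claim: string;
  sourceDoc: string;
  status: 'unverified' | 'true' | 'false';
}

function extractAssumptions(doc: string, sourceDoc: string): AssumptionRecord[] {
  const records: AssumptionRecord[] = [];
  const tag = /\[ASSUMPTION:\s*([^\]]+)\]/g;
  let m: RegExpExecArray | null;
  while ((m = tag.exec(doc)) !== null) {
    // New assumptions start unverified; a later job flips the status.
    records.push({ claim: m[1].trim(), sourceDoc, status: 'unverified' });
  }
  return records;
}
```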

Self-Correction Actions

When an assumption is verified as FALSE, Chronicle doesn't just flag it; it fixes it. Gemini generates a correction, and Chronicle:

  1. Updates the documentation file, replacing the assumption with verified information
  2. Logs an "Action" entry in the Thinking Platform showing before/after
  3. Records the correction reason for audit purposes

This creates self-healing documentation that improves over time.

Gemini Embeddings

Chronicle generates vector embeddings for code files using Gemini's embedding model. These embeddings are stored in SQLite via the sqlite-vec extension, enabling semantic search. When generating documentation, Chronicle can find conceptually related files even if they don't share naming conventions—improving cross-referencing and context gathering.
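In Chronicle the nearest-neighbour query runs inside SQLite via sqlite-vec, but the underlying ranking is plain cosine similarity over the stored vectors. A minimal in-memory sketch:

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the path of the file whose embedding is closest to the query.
function mostRelatedFile(
  query: number[],
  files: { path: string; embedding: number[] }[],
): string {
  return files.reduce((best, f) =>
    cosine(query, f.embedding) > cosine(query, best.embedding) ? f : best,
  ).path;
}
```

This is how two files can be linked even when they share no naming conventions: relatedness lives in the embedding space, not the file names.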

Multi-Stage Documentation

Documentation isn't generated in one shot. Chronicle follows a staged approach:

  1. Stage 1: Project reconnaissance and structure analysis
  2. Stage 2: Core documentation (README, ARCHITECTURE)
  3. Stage 3: Module-level documentation for each package
  4. Stage 4: Detailed implementation docs with diagrams

Each stage builds on the previous, ensuring dependencies are documented before dependents.

Version History

Every documentation generation creates a new version snapshot. Chronicle tracks these versions in the database, allowing developers to see how documentation evolved, compare versions, and understand what changed between updates.

How we built it

Chronicle is a three-part monorepo:

  1. Daemon (TypeScript) - Background service using the @google/genai SDK, with ThinkingLevel configuration for calibrated Gemini 3 Pro reasoning and SQLite with sqlite-vec for vector embeddings.

  2. CLI (TypeScript) - Commands like chronicle init, chronicle diff, and chronicle log for developer interaction.

  3. UI (Next.js) - Dashboard featuring real-time AI activity streaming, job queue visualization, and the Thinking Platform drawer.

Chronicle was built entirely with Google's Antigravity IDE, leveraging its tight code-test-iterate feedback loop to rapidly develop across all three packages.

Challenges we ran into

The Streaming Thoughts Problem

My initial goal was to stream the AI's chain-of-thought in real-time. I quickly discovered that Gemini 3 Pro doesn't expose streaming thoughts. The thoughtSignature confirms extended reasoning happened, but the actual thinking remains opaque.

The Solution: Prompt-Assume-Verify

Instead of extracting thoughts directly, I designed a system that makes assumptions explicit:

  1. AI marks uncertain claims with [ASSUMPTION: claim] tags
  2. Assumptions are stored in the database
  3. Verification jobs check each assumption against actual code
  4. False assumptions trigger self-correction with before/after diffs logged as "Actions"

This turned a limitation into a feature: an auditable trail of what the AI thought was true versus what it verified to be true.

Fast Jobs, Slow Polling

Jobs were completing faster than the UI's 2-second refresh, making the system appear idle. I fixed this by tracking updated_at timestamps and showing recently-completed jobs as active.
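The fix amounts to widening the definition of "active" to cover the polling window. The field names here are illustrative:

```typescript
// A job appears active if it is still queued or running, or if it
// finished within the UI's polling window, so fast jobs no longer
// vanish between 2-second refreshes.
interface Job {
  status: 'pending' | 'running' | 'done';
  updatedAt: number; // epoch milliseconds
}

const POLL_WINDOW_MS = 2000;

function appearsActive(job: Job, now: number): boolean {
  if (job.status !== 'done') return true;
  return now - job.updatedAt < POLL_WINDOW_MS;
}
```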

Accomplishments that we're proud of

  • Self-healing documentation that catches and corrects its own mistakes
  • Thinking Platform that makes AI reasoning transparent and reviewable
  • Zero-config file watching that just works in the background
  • Multi-stage documentation pipeline with proper dependency ordering
  • Successfully leveraging Gemini 3 Pro's thinking levels for optimal quality/speed tradeoffs

What we learned

Thought Signatures in Gemini 3 Pro

Working with thoughtSignature tokens taught us how to detect when extended reasoning was used, even without access to the full chain-of-thought.

The Power of Thinking Levels

Gemini 3 Pro's ThinkingLevel configuration (HIGH, LOW) is crucial for calibrating AI effort to task complexity:

  • HIGH for architectural analysis and complex reasoning
  • LOW for quick summaries and simple tasks

Antigravity IDE Workflow

Building with Antigravity demonstrated how AI-assisted development can dramatically compress development timelines for complex multi-package projects.

What's next for Chronicle

Chronicle Cloud

To make Chronicle truly effective at scale, we're building a cloud-hosted version that integrates directly with GitHub, GitLab, and Bitbucket. This removes the need for local daemons and enables teams to get living documentation without any setup.

Vulnerability Scanning & Threat Protection

Chronicle will use Gemini 3 Pro's Google Search grounding to look up the latest security threats and vulnerabilities as they emerge, combined with CVE database integration for comprehensive coverage. It will scan codebases for known vulnerabilities, outdated dependencies, and security anti-patterns, documenting risks as they're discovered.

PR Integration & Auto-Patching

The ultimate goal: Chronicle doesn't just document problems, it fixes them. Automatic pull request generation for:

  • Documentation updates when code changes
  • Security patches when vulnerabilities are detected
  • Assumption corrections when verification fails

From passive observer to active contributor.
