Inspiration

We're obsessive readers of serial fiction — web novels that run 1,000+ chapters. We started by researching what separates great serials from mediocre ones: what makes characters stick, how foreshadowing compounds over hundreds of chapters, and why some worlds feel alive while others collapse under their own weight.

Originally, we just wanted to build a methodology — a structured playbook for writing commercially viable long-form fiction. No software, just documents and frameworks. But when we tried handing that methodology to AI chatbots, the results were disappointing. They'd forget characters mid-scene, contradict world rules, and drop plot threads entirely. The methodology was solid; the tools weren't.

That's when we realized: the methodology itself needed to become the software. Not just a reference doc, but a living system that feeds structured story knowledge directly into AI generation. That's Inxtone.

What it does

Inxtone is an AI-native storytelling framework that turns Gemini into a story-aware writing partner. Authors build a Story Bible — characters with three-layer motivations, world rules, faction dynamics, foreshadowing seeds — and the system automatically assembles the right context for every AI generation call.
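To make this concrete, here is a minimal sketch of what a Story Bible character entry might look like — the field names are illustrative, not Inxtone's actual schema:

```typescript
// Hypothetical shape of a Story Bible character entry.
// Field names are illustrative; Inxtone's real schema may differ.
interface CharacterEntry {
  name: string;
  voice: string; // how the character speaks on the page
  motivation: {
    surface: string; // what they claim to want
    hidden: string;  // what they privately pursue
    core: string;    // the deep need driving both
  };
}

// Render an entry as a prompt fragment for a generation call.
function characterContext(c: CharacterEntry): string {
  return [
    `Character: ${c.name}`,
    `Voice: ${c.voice}`,
    `Motivation (surface): ${c.motivation.surface}`,
    `Motivation (hidden): ${c.motivation.hidden}`,
    `Motivation (core): ${c.motivation.core}`,
  ].join("\n");
}
```

The point is that each entry is structured data, so the system can decide programmatically what to feed the model for any given scene.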

Write a scene, and Gemini knows your protagonist's voice, remembers the foreshadowing you planted 50 chapters ago, and respects the power system constraints you defined. It's not autocomplete — it's a collaborator that has actually read your entire book.

How we built it

Inxtone was built entirely with Claude Code — and the process itself became part of the lesson.

Two weeks of thinking, four days of coding. Before writing a single line of code, we spent two weeks building the specification layer: a Product Requirements Document (features, user stories, MVP scope), a full Architecture spec (interaction layer, business logic schemas, data layer, module design), a Design Language System (79 components with CSS specs, animation system, responsive breakpoints), and phased Milestone plans (M1–M5) with acceptance criteria and test plans for each.

The key insight: when you give AI a clear, structured specification, the code almost writes itself. Our CLAUDE.md project file pointed Claude Code to exactly which docs to read for any given task. Architecture Decision Records kept major choices documented. A Regulation doc defined git workflow, naming conventions, and testing standards — so Claude Code followed the same rules a human engineer would.

The development timeline tells the story:

  • Feb 5: M1 (Foundation) — monorepo setup, TypeScript interfaces, database schema, CLI + server + web shells
  • Feb 7: M2 (Story Bible Core) — 7 repositories, service layer with EventBus, 45 REST API endpoints, full web UI, CLI commands, performance benchmarks. 695 tests.
  • Feb 8: M3 (Writing + AI) — WritingService, Gemini integration with five-layer context assembly, SSE streaming, three-panel editor UI, Plot system. 1,001 tests.
  • Feb 9: Hackathon polish — English prompt templates, BYOK API key flow, seed data loader, Welcome Screen, Docker deployment.

Four days from empty repo to a production-ready app with over a thousand tests. Not because the code was simple — it's a TypeScript monorepo with four packages, 18 database tables, and a five-layer context engine — but because the thinking was done before the typing started.

The stack: TypeScript, Fastify, React, SQLite (better-sqlite3), Gemini 2.5 Pro via @google/genai SDK, Vitest, pnpm monorepo. Deployed via Docker on Render.

Challenges we ran into

Translating abstract methodology into concrete features. The hardest part wasn't coding — it was deciding which elements of the storytelling methodology deserve to be interactive features versus static reference data. For example: "three-layer character motivation" (surface/hidden/core) sounds elegant on paper, but implementing it as structured input that meaningfully influences AI generation required careful schema design and prompt engineering.
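One way to make the three layers actually influence generation — rather than sit inert in a database — is to render them as explicit behavioral instructions in the prompt. A hedged sketch of that idea (the function and phrasing are hypothetical, not Inxtone's actual prompt templates):

```typescript
// Hypothetical: turn three-layer motivation into generation guidance.
interface Motivation {
  surface: string; // stated goal
  hidden: string;  // private agenda
  core: string;    // deepest need
}

function motivationPrompt(name: string, m: Motivation): string {
  return [
    `${name}'s stated goal is ${m.surface}.`,
    `Privately, ${name} is driven by ${m.hidden}; let this leak through in subtext, not explicit dialogue.`,
    `At the deepest level, ${name} needs ${m.core}; decisions under pressure should trace back to this.`,
  ].join("\n");
}
```

Each layer maps to a different instruction about *how* it may surface in the text, which is what keeps the structure from collapsing into a flat character description.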

Knowing what to cut. The methodology covers everything from pacing curves to dialogue cadence analysis. We had to ruthlessly defer features — dialogue mode with character selection, context item toggle controls, brainstorm-to-continue flow, chapter drag-and-drop — to ship a coherent core experience. The full deferred list grew to 11 items by the end.

Getting AI context right. The five-layer system went through several iterations. Early versions sent too much context (Gemini would get confused by irrelevant world-building details) or too little (characters would go off-voice). Priority-based truncation with per-layer budgets was the breakthrough — it ensures the most story-critical information always makes the cut.
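The mechanism can be sketched roughly as follows — token estimation here is a crude chars/4 approximation, and the layer names and budgets are assumptions, not Inxtone's actual implementation:

```typescript
// Sketch of priority-based truncation with per-layer token budgets.
interface ContextLayer {
  name: string;      // e.g. "world-rules", "character-voice" (illustrative)
  priority: number;  // lower = more story-critical
  budget: number;    // max tokens this layer may consume
  text: string;
}

// Crude approximation: ~4 characters per token.
const estimateTokens = (s: string): number => Math.ceil(s.length / 4);

// Fill a total budget in priority order, capping each layer at its own
// budget, so the most story-critical information always makes the cut.
function assembleContext(layers: ContextLayer[], totalBudget: number): string {
  const parts: string[] = [];
  let used = 0;
  for (const layer of [...layers].sort((a, b) => a.priority - b.priority)) {
    const allowedTokens = Math.min(layer.budget, totalBudget - used);
    if (allowedTokens <= 0) continue;
    const maxChars = allowedTokens * 4;
    const text =
      layer.text.length <= maxChars ? layer.text : layer.text.slice(0, maxChars);
    parts.push(text);
    used += estimateTokens(text);
  }
  return parts.join("\n\n");
}
```

Under this scheme a low-priority layer degrades gracefully (it gets truncated or dropped), while a high-priority layer is only ever cut if the total budget is already exhausted.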

Accomplishments that we're proud of

It actually tells coherent stories across long arcs. Serial fiction runs 1,000+ chapters. Human writers can't remember what they wrote 200 chapters ago — but Inxtone's Story Bible ensures the AI always can. Foreshadowing planted in chapter 3 gets properly resolved in chapter 28. Characters stay in voice. World rules hold.

Research-first, code-second. Two weeks of methodology research, two weeks of product thinking, four days of actual coding. When you know exactly what you're building and why, implementation becomes almost mechanical. We believe this is what building products in the AI era should look like.

1,001 tests passing. Across 42 test files — repositories, services, API routes, context builders, prompt assembly, streaming, and performance benchmarks. Not because we love testing, but because a storytelling tool that silently corrupts your narrative data is worse than no tool at all.

What we learned

Don't start coding until the product is crystal clear. Every hour spent on methodology research saved days of refactoring. Business logic issues, UX decisions, data model questions — they all need answers before the first line of code. In the AI era, code is cheap; clarity is expensive.

Good human expertise can be packaged into software fast. The entire Inxtone methodology — years of reading, analyzing, and thinking about serial-fiction craft — became a working product in under a week. AI-assisted development makes the gap between "I know how this should work" and "here's a working product" almost trivially small. That's a superpower.

What's next for Inxtone

Use it for real. The ultimate test is writing a commercially viable web novel using our own methodology and tool. If Inxtone can help produce fiction that readers actually pay for, that's validation no demo can match.

Package expertise as software. Inxtone proved that domain knowledge + AI can become a product fast. We want to explore this pattern further — interactive courses powered by AI, tools for other creative domains (screenwriting, game narrative), and frameworks that turn any structured methodology into an intelligent assistant.

Open it up. Multi-model support (Claude, GPT, local models via Ollama), plugin systems, and eventually a template marketplace where writing methodologies themselves become shareable, forkable products.

Built With

  • commander.js
  • CSS
  • Docker
  • ESLint
  • Fastify
  • GitHub Actions
  • Google Gemini API (@google/genai)
  • Ink
  • pnpm workspaces
  • Prettier
  • React
  • React Router
  • SQLite (better-sqlite3)
  • TanStack React Query
  • tsup
  • TypeScript
  • Vite
  • Vitest
  • Zod
  • Zustand