Inspiration

As developers, we spend our lives inside Integrated Development Environments like VS Code. We rely on linters, IntelliSense, reference tracking, and debuggers to manage complexity.

Writing a novel has the same structural problems as writing large software systems. Characters behave like classes. Plot threads resemble functions. Timelines act like execution flow.

Yet writers are still working in linear tools like Google Docs or Microsoft Word.

If a writer changes a character’s eye color in Chapter 1 but forgets to update it in Chapter 20, that is a bug. A plot hole.

We asked a simple question. Why do developers have IDEs, but writers do not?

Unix was built to answer that.

What Unix Does

Unix is a distraction-free writing environment that works like a code editor for narrative text.

The Editor
- A clean, markdown-based editor
- Line numbers and focus mode
- Designed to handle long-form documents without performance issues

The Wiki
- A structured sidebar for story lore
- Characters, locations, items, and rules of the world
- Acts as a single source of truth for the narrative

AI Linter
- Powered by Google Gemini
- Continuously analyzes the story against the Wiki
- Flags contradictions and logical inconsistencies
- Examples include character behavior conflicts, broken world rules, and continuity errors

Pacing Analytics
- Visualizes emotional intensity across the story
- Uses sentiment analysis to generate heatmaps
- Helps writers identify slow sections, spikes, and pacing issues

How We Built It

Unix is a modern web application optimized for speed, responsiveness, and low latency.

Frontend
- Next.js, TypeScript, and Tailwind CSS
- Custom text editor built for large documents

Backend and Data
- Supabase for authentication, storage, and realtime capabilities
- Lore and project data stored as structured state

State Management
- Zustand for global and editor state
- Wiki updates instantly propagate to the AI context

AI Layer
- Google Gemini 3 API
- Large context window enables deep narrative understanding
- Only relevant Wiki entries and current chapters are injected into context

Narrative Pacing Model
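The "Wiki updates instantly propagate to the AI context" behavior can be sketched as a small observable store. This is a minimal illustration in plain TypeScript (the real app uses Zustand); the names `WikiEntry`, `WikiStore`, and `buildAiContext` are hypothetical, not the actual Unix code.

```typescript
// Minimal sketch: a Wiki store whose updates immediately rebuild the
// context string that gets injected into the AI prompt.

type WikiEntry = {
  id: string;
  kind: "character" | "location" | "item" | "rule";
  name: string;
  notes: string;
};

type Listener = (context: string) => void;

class WikiStore {
  private entries = new Map<string, WikiEntry>();
  private listeners: Listener[] = [];

  subscribe(fn: Listener): void {
    this.listeners.push(fn);
  }

  // Insert or update an entry, then notify every subscriber with the
  // freshly rebuilt AI context.
  upsert(entry: WikiEntry): void {
    this.entries.set(entry.id, entry);
    const context = this.buildAiContext();
    this.listeners.forEach((fn) => fn(context));
  }

  // Serialize all entries into a compact block for the model prompt.
  buildAiContext(): string {
    return [...this.entries.values()]
      .map((e) => `[${e.kind}] ${e.name}: ${e.notes}`)
      .join("\n");
  }
}
```

A subscriber here would be the component that assembles the next Gemini request, so an edit to a character's Wiki page is reflected in the very next lint pass.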

To analyze pacing, we model emotional intensity over time.

We define a pacing score P(t) for each paragraph t:

P(t) = α · S(t) + β · (W(t) / T_avg)

Where:
- S(t) is the sentiment polarity vector returned by the AI, ranging from -1 to 1
- W(t) is the word count density of the paragraph
- T_avg is the average reading time
- α and β are weighting coefficients tuned during testing

This allows Unix to visualize narrative momentum rather than just raw sentiment.
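The pacing model described above can be sketched in a few lines, assuming a simple linear combination of the two terms (the exact functional form and coefficients in Unix were tuned during testing, so treat `alpha` and `beta` here as illustrative defaults):

```typescript
// Sketch of the pacing score: combine sentiment polarity with
// word-count density relative to average reading time.

interface Paragraph {
  sentiment: number; // S(t), in [-1, 1], as returned by the AI
  wordCount: number; // W(t)
}

function pacingScores(
  paragraphs: Paragraph[],
  avgReadingTime: number, // T_avg
  alpha = 0.6, // illustrative weight for sentiment
  beta = 0.4, // illustrative weight for density
): number[] {
  return paragraphs.map(
    (p) => alpha * p.sentiment + beta * (p.wordCount / avgReadingTime),
  );
}
```

Plotting these scores over the paragraph index yields the heatmap-style view of narrative momentum.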

Challenges We Faced

Context Management

Even with a large context window, sending an entire novel on every request is expensive and slow.

We implemented a Retrieval Augmented Generation pipeline that fetches only the Wiki entries relevant to the current scene.
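The retrieval step can be sketched as a relevance filter over Wiki entries. The version below scores entries by raw keyword overlap with the current scene; this is a simplified stand-in, since a production RAG pipeline would typically rank with embeddings. All names (`tokenize`, `relevantEntries`) are hypothetical.

```typescript
// Sketch: pick only the top-k Wiki entries relevant to the current scene,
// scored by how many of their words also appear in the scene text.

function tokenize(text: string): Set<string> {
  return new Set(text.toLowerCase().match(/[a-z']+/g) ?? []);
}

interface Entry {
  name: string;
  body: string;
}

function relevantEntries(scene: string, entries: Entry[], k = 5): Entry[] {
  const sceneWords = tokenize(scene);
  return entries
    .map((entry) => {
      const words = tokenize(`${entry.name} ${entry.body}`);
      let overlap = 0;
      for (const w of words) if (sceneWords.has(w)) overlap++;
      return { entry, overlap };
    })
    .filter((scored) => scored.overlap > 0) // drop unrelated entries entirely
    .sort((a, b) => b.overlap - a.overlap)
    .slice(0, k)
    .map((scored) => scored.entry);
}
```

Only the surviving entries are serialized into the prompt, which keeps requests small even for long novels.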

Prompt Engineering

By default, Gemini wants to rewrite text. We had to carefully design system instructions so the model behaves like a linter, identifying issues without altering the author’s voice.
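A linter-style system instruction paired with a structured output shape might look like the following. This is an illustrative sketch, not the exact prompt or schema Unix ships with:

```typescript
// Illustrative system instruction: forbid rewriting and force the model
// to report issues as structured data instead of prose edits.

const LINTER_SYSTEM_PROMPT = `
You are a narrative linter. Do NOT rewrite or rephrase the author's text.
Compare the chapter against the Wiki entries provided and report only
contradictions, continuity errors, and broken world rules.
Respond with a JSON array of issues; return [] if none are found.
`.trim();

// Hypothetical shape of each reported issue, suitable for enforcing
// via the model's structured (JSON) output mode.
interface LintIssue {
  severity: "error" | "warning";
  excerpt: string; // the offending passage, quoted verbatim
  conflictsWith: string; // the Wiki entry or earlier passage it contradicts
  explanation: string;
}
```

Constraining the response to a typed issue list, rather than free text, is what keeps the model from "helpfully" editing the manuscript.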

Real-Time Feedback

We wanted feedback to feel instant. Achieving this required aggressive debouncing, background processing, and careful API call scheduling to avoid typing lag.
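The debouncing piece can be sketched as follows; the 500 ms delay and `requestLint` name are illustrative, and the real scheduling also involves background processing:

```typescript
// Sketch: coalesce rapid keystrokes so only the last edit within the
// window triggers an AI analysis request.

function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  delayMs: number,
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    if (timer !== undefined) clearTimeout(timer); // cancel the pending call
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical usage: lint at most once per pause in typing.
const requestLint = debounce((text: string) => {
  console.log(`linting ${text.length} chars`);
}, 500);
```

Because the pending timer is cancelled on every keystroke, the editor's hot path never waits on the network, which is what keeps typing lag-free.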

What We Are Proud Of
- A context-aware experience where the AI correctly flags inconsistencies introduced many chapters earlier
- A UI that feels like a serious developer tool while remaining approachable for writers
- Strong collaboration across frontend and AI logic to ship a cohesive product quickly

What We Learned

Large Language Models excel at structural reasoning.

They are not just generators. They are powerful validators.

By framing storytelling as a logic and state management problem, we unlocked a new way to use generative AI. We also gained deep experience working with Gemini structured outputs and long context management.

What Is Next for Unix
- Live collaboration so multiple writers can work on the same project in real time

Updates

Just completed Unix. Added:
- .unixrc: defines the rule set and environment behaviour
- Wiki: stores data about characters, places, people, items, and other lore that the AI draws on for consistent, context-aware generation
- AI analysis: scans for lore holes and inconsistencies. For example, if your writing drifts from established lore, it flags the inconsistency and asks whether you want the AI to suggest an edit that preserves your context while putting the character back on track
