Inspiration

We were inspired by the idea of combining the power of Copilot Chat and Zed AI inside Neovim — bringing cutting-edge LLM capabilities directly into a fast, lightweight, and customizable editor. We wanted to create a developer-first experience where AI becomes a natural part of the coding workflow, not a distraction.

What it does

Code Companion seamlessly integrates a wide range of LLMs — including Anthropic, GitHub Copilot, OpenAI, DeepSeek, Gemini, Mistral AI, Novita, HuggingFace, xAI, Ollama, and even custom models — into Neovim. It supports:

  • Inline code transformations, generation, and refactoring

  • Context-aware variables and slash commands

  • Prompt library for error explanations, code suggestions, and more

  • Vision and image input support

  • Multi-chat sessions

  • Fast asynchronous execution

All designed to supercharge your productivity directly in your terminal.
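As a rough illustration of how those pieces fit together, a user's setup call might look like the sketch below. The option names (`strategies`, `chat`, `inline`, `adapter`) are illustrative assumptions, not the plugin's verbatim configuration schema:

```lua
-- Illustrative Neovim config sketch; option names are assumptions,
-- not the plugin's exact configuration schema.
require("codecompanion").setup({
  strategies = {
    -- Pick which adapter (LLM backend) each interaction mode uses.
    chat   = { adapter = "anthropic" },
    inline = { adapter = "copilot" },
  },
})
```

In a typical workflow this would live in the user's Neovim config, letting chat sessions and inline transformations target different providers.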

How we built it

We built Code Companion as a Neovim plugin using Lua, with support for async execution to maintain performance. We designed a modular architecture to support user-contributed adapters and built-in hooks for prompt customization. We also created a flexible chat system that enables multiple conversations in parallel and supports vision input for multimodal interaction.
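The adapter pattern described above can be sketched as a Lua module that maps one provider's API onto a common shape, so the chat core never needs provider-specific code. All names and fields here are hypothetical, not the plugin's real adapter interface:

```lua
-- Hypothetical adapter sketch: each provider implements the same
-- small surface, so new backends can be contributed as plain modules.
local M = {}

M.name = "example_provider"                 -- provider display name
M.url  = "https://api.example.com/v1/chat"  -- placeholder endpoint

-- Translate a neutral message list into the provider's request body.
function M.form_request(messages, opts)
  return {
    model    = opts.model or "example-model",
    messages = messages,
  }
end

-- Extract the assistant's text from the provider's decoded JSON reply.
function M.parse_response(decoded)
  return decoded.choices and decoded.choices[1].message.content
end

return M
```

Because every adapter exposes the same two functions, supporting a new LLM service means writing one small module rather than touching the chat system itself.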

Challenges we ran into

  • Managing performance while handling multiple asynchronous tasks

  • Designing a unified interface that works across very different LLM APIs

  • Ensuring the plugin remains lightweight and doesn't slow down Neovim

  • Making prompt customization powerful yet user-friendly

  • Handling image input within a text-based editor
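One way to address the async and responsiveness challenges above is to run API calls in a subprocess and only touch the UI from the completion callback. A minimal sketch using Neovim's built-in `vim.system` (the endpoint and notification are illustrative):

```lua
-- Sketch: non-blocking request via a curl subprocess. The endpoint
-- is a placeholder. vim.system runs asynchronously when given a
-- callback, so Neovim's main thread is never blocked.
vim.system(
  { "curl", "-s", "https://api.example.com/v1/chat" },
  { text = true },
  function(result)
    -- The callback fires on the event loop; wrap any UI work in
    -- vim.schedule() so it runs safely on the main thread.
    vim.schedule(function()
      if result.code == 0 then
        vim.notify("LLM response received: " .. result.stdout)
      end
    end)
  end
)
```

Keeping every network call off the main thread this way is what lets multiple chat sessions run in parallel without the editor stuttering.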

Accomplishments that we're proud of

  • Supporting a broad range of LLMs with community-contributed adapters

  • Creating a fully async, highly responsive chat experience in Neovim

  • Building a flexible system for custom slash commands, variables, and prompt workflows

  • Enabling multimodal (text + vision) interaction within a terminal-based editor

What we learned

  • Building a real-time AI experience inside a terminal environment requires deep knowledge of async design patterns

  • Developers love flexibility — giving them tools to customize prompts and workflows was key

  • There's growing demand for integrating AI deeply into existing dev tools like Neovim, not replacing them

What's next for Code Companion

  • Expanding support for more LLMs and services

  • Adding richer GUI components (optional) for visual previews and debugging

  • More intelligent context awareness (e.g. automatic prompt generation from buffer content)

  • Supporting fine-tuned workflows for specific domains like frontend, backend, and data science

  • Growing the community of adapter contributors and plugin users
