Inspiration
Let's be honest: we all use AI at hackathons. Last month, I was speed-running a feature using GitHub Copilot. I hit Tab, the code appeared, and it looked perfect. The syntax was clean. The variable names were right. The logic seemed sound. I pushed it. And it failed. Why? Because the AI autocompleted a standard saveUser() function but completely ignored a business rule hidden in a file three folders away (UserConfig.java) that requires a specific encryption step. The AI didn't make a syntax error; it made a context error. I realized that generative AI is like a junior developer on unlimited caffeine: it writes code fast, but it has no idea how the whole system fits together. I didn't need another code generator; I needed a code auditor. I built Shadow Compiler not to write code for me, but to protect me from the code my AI and I are writing.
What it does
Shadow Compiler is a VS Code extension that acts as a real-time, AI-powered semantic analyzer. It doesn't just check your syntax; it reads your entire project context to understand your business logic. It watches as you type. It understands the relationships between files. It warns you of logic errors (using amber "Ghost" wavy lines) before you even hit run. It is effectively a 24/7 Senior Engineer sitting on your shoulder, whispering, "Hey, are you sure you want to do that without a transaction?"
I built this specifically to leverage Gemini 3's 2M token context window. In enterprise software, a bug in OrderService.java is often caused by a configuration in application.yml five folders away. Standard models (128k context) simply cannot "see" far enough to catch these cross-file integration bugs. By loading the entire application context into memory, Shadow Compiler catches production-breaking logic errors before the code is ever compiled, instead of hours later in a failed deploy.
How we built it
- The "Ghost" UI: Built on the VS Code Decorator API to inject real-time "wavy line" warnings directly into the editor canvas (no clunky sidebars).
- The Context Engine: A custom Node.js recursive file walker (using fs.promises) that captures the entire project structure to feed the 2M+ token context window.
- The Brain (Gemini 3): We leverage Gemini's massive context to hold the entire codebase plus library dependencies in memory, enabling cross-file semantic reasoning.
- Smart Debouncer: A custom algorithm that waits for semantic pauses (like typing ; or }) to trigger analysis, keeping the UI responsive.
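As a rough sketch of the Context Engine idea, here is a minimal recursive file walker built on fs.promises that gathers every source file and concatenates it into one large prompt. The skip list and the "// FILE:" delimiter are illustrative assumptions, not the project's actual format:

```typescript
// Minimal sketch of a Context Engine file walker (fs.promises based).
// The skipped directory names are an assumption for illustration.
import { promises as fs } from "fs";
import * as path from "path";

const SKIP = new Set(["node_modules", ".git", "dist"]);

// Recursively collect every file path under `dir`.
async function walk(dir: string): Promise<string[]> {
  const entries = await fs.readdir(dir, { withFileTypes: true });
  const files: string[] = [];
  for (const entry of entries) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      if (!SKIP.has(entry.name)) files.push(...(await walk(full)));
    } else {
      files.push(full);
    }
  }
  return files;
}

// Concatenate every file, tagged with its path, into one context blob
// that can be sent to a large-context model in a single request.
async function buildContext(root: string): Promise<string> {
  const files = await walk(root);
  const parts = await Promise.all(
    files.map(async (f) => `// FILE: ${f}\n${await fs.readFile(f, "utf8")}`)
  );
  return parts.join("\n\n");
}
```

Tagging each chunk with its file path lets the model reason about cross-file relationships (e.g. between OrderService.java and application.yml) rather than seeing one anonymous wall of code.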
Challenges we ran into
- Context Leaks: The AI initially confused variables between different methods. We fixed this by engineering a "Strict Scope" system prompt that forces the model to isolate method logic.
- UI Freezing: Our initial synchronous file reading froze VS Code. We had to refactor the entire I/O layer to be non-blocking and asynchronous.
- Race Conditions: Users typed faster than the API could reply. We solved this with Document Version Locking: discarding any AI analysis that didn't match the current editor version.
Accomplishments that we're proud of
- It feels "Native": We achieved a UX that feels exactly like the standard Java compiler, but smarter.
- The "Impossible" Catch: Successfully flagging a missing @Transactional annotation based on a database config file located five folders away, something standard linters simply cannot do.
- Zero Config: No YAML files, no setup. You open a file, and the "Ghost" is already watching.
What we learned
- Prompt Engineering is Coding: Writing the system prompt required as much debugging as the TypeScript code. "Negative Constraints" (telling the AI what not to do) were critical to stopping hallucinations.
- Latency is the UX Killer: Even a smart tool is useless if it feels slow. The Smart Debouncer taught us that perceived speed is just as important as actual speed.
- Reasoning > Retrieval: You can't just RAG your way through code logic. You need a model (Gemini 3) that can simulate runtime behavior in its head.
What's next for Shadow-Compiler
- Auto-Fix Actions: Moving from "warning" to "fixing" (e.g., one-click injection of missing transaction contexts).
- Polyglot Support: Expanding the context engine to support Python, Rust, and TypeScript.
- Team Knowledge Graph: Analyzing past git commits so the compiler "learns" from your team's previous mistakes.
Built With
- eslint
- gemini
- mocha
- node.js
- npm
- typescript
