Project Submission
Inspiration: The Digital Graveyard
The internet is haunted. While we build shiny UIs, the bedrock of our global economy is built on the bones of dead protocols.
Interfacing with these systems is a critical risk. 90% of the world's credit card transactions still flow through proprietary, 40-year-old mainframe systems, and manual parsing of these archaic protocols is a major reason why Injection (OWASP A03) remains a top security vulnerability.
That is why I built the Protocol Resurrection Machine (PRM), a Universal Necromancy Engine. It takes a dead protocol's DNA (a simple spec) and instantly stitches together a modern, living body: correct, secure, type-safe code for Rust, Go, Python, and TypeScript, generated simultaneously.
What It Does
PRM is a comprehensive meta-programming system that transforms declarative intent into executable reality.
Universal Code Generation: You define the protocol structure in a simple YAML schema. PRM compiles this into idiomatic, type-safe code for TypeScript, Python, Go, and Rust; automatically handling binary packing, buffer management, and socket communication.
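As a sketch of one slice of that pipeline, here is what emitting a type-safe interface from a declarative field list might look like. The `FieldSpec` shape and `emitInterface` helper are invented for illustration and are not PRM's actual schema or API:

```typescript
// Hypothetical spec shape, standing in for the YAML schema after it is loaded.
type FieldSpec = { name: string; type: "u8" | "u16" | "string"; length?: number };
type ProtocolSpec = { name: string; fields: FieldSpec[] };

// Emit a TypeScript interface from the declarative spec.
function emitInterface(spec: ProtocolSpec): string {
  const tsType = (f: FieldSpec) => (f.type === "string" ? "string" : "number");
  const lines = spec.fields.map((f) => `  ${f.name}: ${tsType(f)};`);
  return `interface ${spec.name} {\n${lines.join("\n")}\n}`;
}

// An illustrative two-field protocol message.
const heartbeat: ProtocolSpec = {
  name: "Heartbeat",
  fields: [
    { name: "deviceId", type: "u16" },
    { name: "status", type: "u8" },
  ],
};

console.log(emitInterface(heartbeat));
```

The real generator additionally has to emit packing/unpacking code per target language; this sketch only shows the spec-to-types step.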
Interactive Workbench: A visual IDE that doesn't just edit code; it simulates it. It features a Cinematic Topology visualizer to watch data packets flow in real-time and a "Smart Auto-Fix" doctor that detects and repairs logic errors in your spec.
Instant MCP Integration: With a single click, PRM generates a Model Context Protocol (MCP) server. This allows AI Agents (like Claude or Kiro) to natively "speak" these dead protocols, bridging the gap between 1990s tech and 2024 AI.
Protocol Discovery: A built-in network sniffer that scans ports, fingerprints running services, and identifies unknown protocols automatically based on heuristic signatures.
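Heuristic fingerprinting of this kind typically matches service banners against a signature table. The table and `fingerprint` function below are invented for illustration, not PRM's actual discovery engine:

```typescript
// Hypothetical signature table: each entry maps a banner pattern to a protocol name.
const signatures: Array<{ name: string; pattern: RegExp }> = [
  { name: "SMTP", pattern: /^220 .*SMTP/i },   // e.g. "220 mail.example.com ESMTP"
  { name: "FTP", pattern: /^220 .*FTP/i },
  { name: "POP3", pattern: /^\+OK/ },
];

// Return the first protocol whose signature matches the observed banner.
function fingerprint(banner: string): string {
  const hit = signatures.find((s) => s.pattern.test(banner));
  return hit ? hit.name : "unknown";
}
```

A production sniffer would combine banner matching with port numbers and timing behaviour, but the core idea is the same: unknown traffic is classified by the closest known signature.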
Simulation and Verification: The system spins up a live "Mesh Network" in the browser to verify that your Rust sensor can talk to your Go gateway without data corruption.
How I Used Kiro
Kiro was my Chief Architect. It used its advanced features to simulate an entire engineering department:
Spec-Driven Architecture: I didn't write code first. I described what I wanted the system to do, and Kiro produced the requirements and the design. Kiro helped me define 29 Correctness Properties (e.g., "For any valid message M, parse(serialise(M)) == M") before I wrote a single line of TypeScript.
AI Steering: I fed Kiro custom steering documents like rust-idioms.md and go-idioms.md to teach it the difference between "Valid Code" and "Idiomatic Code." This ensured the generated Rust used zero-copy lifetimes ('a) and the Go code used proper error-handling patterns.
Agentic Workflow (The QA Team): I created custom Agent Hooks (.kiro/hooks/) to automate quality assurance. Every time I saved a specification, Kiro triggered property-based tests (using fast-check), fuzzing the parsers with thousands of random inputs to catch correctness violations.
Traceability: Every feature in the final codebase is linked to a specific requirement in the Kiro specs, preventing scope creep and keeping everything aligned with the core value proposition.
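The round-trip property can be sketched as a fuzz loop. This is a minimal hand-rolled stand-in for fast-check's generated arbitraries; the length-prefixed message format here is illustrative, not PRM's actual wire format:

```typescript
type Message = { id: number; payload: string };

function serialise(m: Message): string {
  // Length-prefix the payload so delimiters inside it cannot desynchronise parsing.
  return `${m.id}|${m.payload.length}|${m.payload}`;
}

function parse(wire: string): Message {
  const first = wire.indexOf("|");
  const second = wire.indexOf("|", first + 1);
  const id = Number(wire.slice(0, first));
  const len = Number(wire.slice(first + 1, second));
  return { id, payload: wire.slice(second + 1, second + 1 + len) };
}

// Fuzz the property "parse(serialise(M)) == M" with random printable payloads.
for (let i = 0; i < 1000; i++) {
  const m: Message = {
    id: Math.floor(Math.random() * 65536),
    payload: Array.from({ length: Math.floor(Math.random() * 20) }, () =>
      String.fromCharCode(32 + Math.floor(Math.random() * 95)),
    ).join(""),
  };
  const back = parse(serialise(m));
  if (back.id !== m.id || back.payload !== m.payload) {
    throw new Error(`round-trip failed for ${JSON.stringify(m)}`);
  }
}
```

fast-check expresses the same loop declaratively as a property over generated arbitraries, and additionally shrinks failing inputs to a minimal counterexample.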
Challenges Encountered
Developing a cross-language compiler introduced complex engineering problems:
The "Double-Eating" Desynchronization Bug (Go/Rust): The early parsers crashed on valid inputs because bytes.Cut consumed a delimiter that the validation logic then tried to check again; this kind of logic gap is a potential HTTP Request Smuggling vector. I implemented "Lookahead Logic" in the generator templates so the parser validates a delimiter before consuming it, ensuring it never desynchronizes from the byte stream.
The "Ghost Field" Crash (Python): My tokeniser extracted variables that didn't exist in the dataclass definitions, causing runtime failures. I built a "Symmetry Check" into the validation layer to ensure the AST for the parser exactly matches the AST for the structs before generation runs.
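The lookahead fix for the "Double-Eating" bug can be sketched as follows. This is a reconstruction from the description above, using a string buffer and an invented CRLF-terminated frame format rather than PRM's generated code:

```typescript
// Peek at the delimiter to validate it, then advance the cursor exactly once.
// No byte is ever consumed by one step and re-checked by another.
function readField(buf: string, cursor: number): { value: string; next: number } {
  const end = buf.indexOf("\r\n", cursor);
  if (end === -1) throw new Error("missing CRLF terminator"); // validate first (lookahead)
  return { value: buf.slice(cursor, end), next: end + 2 };    // then consume once
}

const frame = "HELLO\r\nWORLD\r\n";
const a = readField(frame, 0);      // value "HELLO", cursor advances past CRLF
const b = readField(frame, a.next); // value "WORLD"
```

The buggy pattern, by contrast, split the buffer at the delimiter (dropping it) and then had a separate validation pass look for the delimiter again, reading one field ahead of where it thought it was.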
Visualising Concurrency: Rendering real-time packet flow caused the topology graph to flash and jitter. I architected a "Puppeteer Pattern" in Svelte: render the graph once, statically, then use a high-performance animation queue to inject CSS classes into the SVG DOM, decoupling the visual layer from the data updates.
Binary vs. Text Ambiguity: Designing a single YAML format for both text (newline-delimited) and binary (fixed-width) protocols was difficult. I built a Type Inference Engine (ProtocolDoctor) that scans format strings and auto-injects the correct numeric types (e.g., u16) or string terminators to prevent generation failures.
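One plausible inference rule for the engine described above: fixed-width fields get the smallest unsigned integer type that covers their byte width, and free-form fields get a terminator. The `RawField` shape and `inferType` name are hypothetical, not ProtocolDoctor's actual API:

```typescript
type RawField = { name: string; width?: number }; // width in bytes, absent for text fields
type TypedField = { name: string; type: string };

function inferType(f: RawField): TypedField {
  if (f.width === undefined) {
    // No fixed width: treat as text and attach a terminator instead of a size.
    return { name: f.name, type: "string (newline-terminated)" };
  }
  // Pick the smallest unsigned integer that covers the declared byte width.
  const type = f.width <= 1 ? "u8" : f.width <= 2 ? "u16" : f.width <= 4 ? "u32" : "u64";
  return { name: f.name, type };
}
```

With a rule like this, a two-byte `port` field is auto-typed `u16` and never reaches the generator untyped, which is the class of failure the doctor prevents.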
Accomplishments
- OWASP A03 Mitigation: The generated parsers treat input strictly as data, never as code. By generating strict State Machines, I eliminated the ambiguity that leads to Injection attacks.
- The "Triangle" Simulation: Successfully running a secure Rust IoT Sensor transmitting binary data to a Go Gateway and a Python Analysis Bot with zero data corruption was a "Eureka" moment.
- The "Summon Daemon" UI: I demonstrated that security tools don't have to be boring. The Halloween-themed interface transforms protocol engineering into cyber-necromancy.
- Full Stack Coherence: Verified that all 4 language implementations (Rust, Go, Python, TypeScript) compiled and passed round-trip correctness tests, proving the core universal compiler logic is sound.
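The "strict state machine" idea behind the OWASP A03 bullet can be sketched as an explicit-state parser that accepts only a narrow grammar and treats every value byte as inert data. The `KEY=VALUE` grammar below is invented for illustration:

```typescript
type State = "KEY" | "VALUE";

// Accept only `KEY=VALUE` lines: keys are [A-Za-z0-9_], values are stored
// verbatim and never interpreted, so hostile bytes cannot become commands.
function parseLine(line: string): { key: string; value: string } {
  let state: State = "KEY";
  let key = "";
  let value = "";
  for (const ch of line) {
    if (state === "KEY") {
      if (ch === "=") { state = "VALUE"; continue; }
      if (!/[A-Za-z0-9_]/.test(ch)) throw new Error(`illegal key byte: ${ch}`);
      key += ch;
    } else {
      if (ch === "\n") break;
      value += ch; // data, never code
    }
  }
  if (state !== "VALUE" || key === "") throw new Error("malformed line");
  return { key, value };
}
```

Because every transition is explicit, there is no ambiguous middle ground for an injection payload to exploit: input either matches the grammar or is rejected outright.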
Lessons Learned
- The Power of Declarative Specs: Defining what a protocol is rather than how to parse it reduces bugs and prevents drift across implementations.
- Language Idioms Matter: Code cannot be translated line-by-line. Respecting each language's unique constructs (Rust's Result, Go's error returns, Python's asyncio) produces readable, maintainable code.
- Simulation is Truth: Static analysis is insufficient. True understanding came from visualising invisible network traffic in the Workbench simulator.
What's Next
- Automatic CVE Scanning: Integrating with SAST tools to verify generated code against known weakness databases.
- Wireshark Dissector Generation: Automatically producing .lua scripts for instant traffic inspection.
- Fuzzing-as-a-Service: Using the MCP server to let AI agents fuzz-test live endpoints.
- Reverse Engineering: AI-assisted generation of YAML specs directly from raw network traffic (PCAP files).
Built With
- fast-check
- go
- kiro
- model-context-protocol
- node.js
- python
- rust
- sveltekit
- tailwind-css
- typescript
- vercel
