Inspiration

The chat box is a bottleneck for superintelligence. When developing an AI-native operating system like SymbiOS, the digital infrastructure is only half the battle; the physical human-computer interface remains stuck in the 1990s paradigm of clicking icons. We realized that true deterministic AI (agents that actually take over the UI and execute tasks) requires a physical safety mechanism and a tactile command center.
After years of mixing tracks in live DJ sets, the inspiration hit: tactile, physical controls are the most intuitive way to manage complex, concurrent streams of output. If we can mix audio on a deck, we can mix multi-LLM compute on a dial. We needed a physical "clutch" for human-in-the-loop agent handoffs, and Logitech's MX ecosystem provided the perfect hardware foundation to build it.
What it does

Synapse transforms the Logitech MX Master 4 and MX Creative Console into the physical control layer for autonomous agents and distributed LLM orchestration.
The Agent Clutch (MX Master 4): By pressing and holding the Actions Ring, users engage a hardware-level "clutch." This instantly unmutes a low-latency voice pipeline (powered by DifficultAI) and hands OS cursor control to our deterministic visual agent, Jayu. You speak your intent; the agent drives the UI. Releasing the ring acts as an absolute hardware interrupt, severing the agent's control and returning the machine to the human.
The Kernel Mixer (MX Creative Console): The console is repurposed from an app switcher into a multi-LLM control board. Dials dynamically allocate compute weighting between local nodes and cloud models, and keypad taps execute near-instant context switches between specialized voice-native personas (e.g., swapping from a VibeCoder agent to a web-scraping agent).
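As a rough sketch of the weighting logic: normalize the dial's position into a local/cloud split. The 0-255 dial range and the RoutingWeights shape here are assumptions for illustration, not the shipped routing config.

```typescript
// Minimal sketch: map a dial position onto a local/cloud compute split.
// The 0-255 range and the RoutingWeights shape are illustrative assumptions.
interface RoutingWeights {
  local: number; // share of requests routed to local nodes
  cloud: number; // share routed to cloud models
}

function dialToWeights(position: number): RoutingWeights {
  const clamped = Math.min(255, Math.max(0, position));
  const local = clamped / 255;
  return { local, cloud: 1 - local };
}
```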
How we built it

We engineered a high-throughput TypeScript monorepo with sub-50ms end-to-end latency.
At the edge, we built a custom Logitech Actions SDK plugin that intercepts raw hardware events (ring presses, dial rotations) and streams them via WebSockets to our synapse-core-daemon, a Fastify-powered local server.
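A stripped-down sketch of the plugin side, assuming hypothetical event names and a local daemon endpoint; the real Logitech Actions SDK callbacks are elided.

```typescript
// Hypothetical shape of the hardware events the plugin forwards; the
// event names and daemon URL are illustrative, not the real SDK API.
import WebSocket from "ws";

type HardwareEvent =
  | { type: "ring.press"; ts: number }
  | { type: "ring.release"; ts: number }
  | { type: "dial.rotate"; dial: number; delta: number; ts: number };

const daemon = new WebSocket("ws://127.0.0.1:7070/events");

function forward(event: HardwareEvent): void {
  // Serialize once and fire-and-forget; the daemon owns all ordering logic.
  if (daemon.readyState === WebSocket.OPEN) {
    daemon.send(JSON.stringify(event));
  }
}

// Illustrative wiring -- in the real plugin these come from SDK callbacks.
forward({ type: "ring.press", ts: Date.now() });
```

On the receiving end, the daemon exposes a matching WebSocket route on Fastify (presumably via something like @fastify/websocket) and fans events into the state machine described below.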
Inside the daemon, an XState machine manages the critical safety logic of the system. When the clutch is engaged, the daemon simultaneously opens a LiveKit/Deepgram voice stream and triggers the Jayu API to initialize visual OS-level hooks. Dial rotations from the Creative Console are mapped directly to the SymbiOS kernel, adjusting token limits and model routing on the fly.
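A minimal sketch of that state machine, assuming XState v5 and hypothetical engage/sever actions standing in for the LiveKit, Deepgram, and Jayu wiring:

```typescript
import { setup, createActor } from "xstate";

// Two-state clutch: the machine is either under human or agent control,
// and the ring press/release events are the only way to cross over.
const clutchMachine = setup({
  types: {
    events: {} as { type: "RING_PRESS" } | { type: "RING_RELEASE" },
  },
  actions: {
    engageAgent: () => {
      // Assumed hooks: open the LiveKit/Deepgram voice stream,
      // trigger the Jayu API to take the cursor.
    },
    severAgent: () => {
      // Hard interrupt: drop the agent's UI hook, return control (assumed).
    },
  },
}).createMachine({
  id: "clutch",
  initial: "human",
  states: {
    human: { on: { RING_PRESS: { target: "agent", actions: "engageAgent" } } },
    agent: { on: { RING_RELEASE: { target: "human", actions: "severAgent" } } },
  },
});

const actor = createActor(clutchMachine).start();
actor.send({ type: "RING_PRESS" });   // clutch engaged: agent drives the UI
actor.send({ type: "RING_RELEASE" }); // clutch released: human takes back control
```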
Challenges we ran into

Building a physical bridge to an AI operating system is a latency war. The most significant challenge was guaranteeing the absolute authority of the "clutch release." If an autonomous agent is driving the mouse to execute a complex sequence, the hardware release event must penetrate the processing stack in less than 50 milliseconds to forcefully sever the agent's UI hook.
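Conceptually, the fix is to give the release event its own synchronous fast path so it never queues behind pending agent work. A sketch, with severAgentHook standing in for the real OS-level unhook:

```typescript
// Sketch of the release fast path. severAgentHook and the event shape are
// assumptions; the point is that ring.release never waits behind queued work.
type HardwareEvent = { type: string; payload?: unknown };

const agentQueue: HardwareEvent[] = [];

function severAgentHook(): void {
  // Assumed OS-level call that tears down the agent's cursor/UI hook.
}

function onWireEvent(raw: string): void {
  const event = JSON.parse(raw) as HardwareEvent;
  if (event.type === "ring.release") {
    severAgentHook();      // synchronous: no awaits between wire and unhook
    agentQueue.length = 0; // drop agent actions still waiting to execute
    return;
  }
  agentQueue.push(event);  // everything else takes the normal async path
}
```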
Additionally, mapping the continuous stream of integer deltas from the Creative Console's dials onto dynamically scaled LLM context windows, without crashing the local node, required writing heavily debounced, rate-limited middleware in the core daemon.
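A condensed sketch of that middleware, treating the token bounds, step size, and flush interval as illustrative assumptions:

```typescript
// Coalesce bursts of dial ticks into at most one kernel update per interval.
// MIN/MAX bounds, step size, and the apply callback are assumptions.
const MIN_TOKENS = 1_024;
const MAX_TOKENS = 128_000;
const TOKENS_PER_TICK = 512;

function createDialMiddleware(
  apply: (tokens: number) => void,
  intervalMs = 50,
): (delta: number) => void {
  let pending = MIN_TOKENS;
  let timer: NodeJS.Timeout | null = null;

  return (delta: number) => {
    // Accumulate every tick, but clamp so a fast spin can't exceed bounds.
    pending = Math.min(
      MAX_TOKENS,
      Math.max(MIN_TOKENS, pending + delta * TOKENS_PER_TICK),
    );
    if (timer === null) {
      timer = setTimeout(() => {
        timer = null;
        apply(pending); // trailing flush: one resize per burst, not per tick
      }, intervalMs);
    }
  };
}
```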
Accomplishments that we're proud of

We successfully elevated productivity peripherals into bleeding-edge infrastructure. We demonstrated that the safest, most seamless way to interact with superintelligence isn't a better software UI but a physical, tactile human-in-the-loop system. We integrated our proprietary voice-to-voice architecture and visual agents directly into a commercially available hardware ecosystem without compromising speed or safety.
What we learned

We discovered the massive untapped potential of the Logitech Actions SDK when it is decoupled from standard application plugins and instead wired directly into a centralized OS daemon. We also learned that physical, tactile feedback for AI context switching drastically reduces the cognitive load of managing concurrent autonomous tasks.
What's next for Synapse

Synapse will become the standard physical interface layer for the Tiny Window ecosystem. We are expanding the Kernel Mixer's capabilities to support granular, on-chain compute allocation directly from the dials. Our immediate focus is hardening the core daemon's dead-man switch protocols as we prepare to step on stage and demo a live, hardware-driven autonomous agent handoff at the finals in Switzerland.
Built With
- deepgram
- fastify
- livekit
- logitechactionssdk
- node.js
- pnpm
- python
- react
- typescript
- vite
- websockets
- xstate
- zod