VibeDeck — Where vibe coding meets hardware.

Inspiration

We watched a developer use Cursor for the first time. They typed a prompt, AI generated 200 lines of code, and then... they froze. "How do I accept this?" Ctrl+Y? Ctrl+Enter? Tab? They right-clicked, scrolled through a menu, found "Accept," clicked it. Then the next suggestion appeared. Same dance. Right-click, scroll, click. Right-click, scroll, click.

This person was building a full-stack app in 20 minutes — but spending half that time fighting the interface to say "yes" or "no" to code they could read in seconds.

That's when it clicked. Vibe coding has changed what developers do — they describe, review, accept, reject, explore. But the input devices haven't changed at all. We're still using a keyboard designed for typing to do a job that's mostly about deciding. It's like editing a film with a typewriter.

We looked at the MX Creative Console — buttons, a dial, an LCD screen, haptic feedback — and saw exactly what vibe coding was missing: a physical review surface. A way to scroll through diffs with a twist, accept with a press, and feel the rhythm of code review in your hands.

The original inspiration was simple: what if "accept" and "reject" were just two buttons? What if scrolling through AI suggestions felt like flipping through vinyl records with a dial? What if your mouse buzzed differently for "tests passed" versus "bug found"? What if the vibe in vibe coding was literal?


What it does

VibeDeck turns the Logitech MX Creative Console into a dedicated AI coding copilot controller. It gives the describe → review → decide → explore cycle of vibe coding its own physical interface.

Three Modes, One Surface

The Dynamic Folder automatically shows the right controls for what you're doing:

Review Mode — When you're looking at AI-generated diffs:

  • The dial scrolls through diff hunks one by one. The encoder display shows "3/12 changes." Each tick gives your MX Master 4 a haptic pulse — you feel the rhythm of the review.
  • Green "Accept" button on the LCD. One press. Done. Crisp haptic tap.
  • Red "Reject" button. One press. Gone. Soft double-tap.
  • "Accept All" and "Reject All" for batch decisions when you trust the output.
  • LCD file indicator showing which file you're reviewing and how many changes remain.
  • Animated green/red color bars on the display representing additions vs. deletions — you see the shape of the change before reading it.

Chat Mode — When the AI chat panel is open:

  • Send message, new conversation, copy last response buttons.
  • "Insert Code" button — pastes the AI's code block directly into the editor at cursor position.
  • Dial scrolls through conversation history.
  • LCD shows the last AI response preview.

Explore Mode — While reading or writing code:

  • "Explain" — AI explains the selected code block.
  • "Generate Tests" — creates unit tests for the current function.
  • "Refactor" — AI rewrites the selection cleaner.
  • "Find Bugs" — AI scans the current file for issues.
  • "Fix" — AI fixes the highlighted problem.
  • Dial cycles through multiple AI suggestions when available. Press to accept.

Always Available:

  • Toggle inline suggestions on/off (multistate button: ON green / OFF gray).
  • Open/close AI chat panel.
  • Undo last AI change — one-tap rollback, no Ctrl+Z guessing.
  • Save all files.

The Dial Experience

The dial is the centerpiece. In diff review, each click of rotation advances one hunk. The encoder display updates: "4/12 → 5/12." Your mouse gives a subtle haptic tick with each advance. When you reach the last change, a smooth wave haptic tells your hand "that's everything" before your eyes even check. Twist left to go back. Press to accept the current hunk.

When the AI offers multiple suggestions (Cursor's "Next/Previous suggestion"), the dial cycles through them. Twist right: option B appears. Twist again: option C. The display shows "Suggestion 2/4." Press the dial to accept. It feels like browsing — because it is.
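As a sketch, the dial behavior above maps onto the SDK's PluginDynamicAdjustment base class roughly like this (the class shape follows the Loupedeck-derived Actions SDK samples; SendShortcut is a hypothetical helper standing in for the real keystroke-sending call, and the hunk total would come from the active review session):

```csharp
using System;
using Loupedeck;

// Sketch: one dial tick = one diff hunk. Position is tracked in the
// plugin itself, so the encoder display works without any IDE API.
public class DiffScrollAdjustment : PluginDynamicAdjustment
{
    private Int32 _position = 1;
    private Int32 _total = 12; // illustrative; tracked per review session

    public DiffScrollAdjustment()
        : base("Diff Scroll", "Scroll through diff hunks", "Review", hasReset: true)
    {
    }

    protected override void ApplyAdjustment(String actionParameter, Int32 diff)
    {
        // Advance or rewind one hunk per tick, clamped to the diff bounds.
        this._position = Math.Clamp(this._position + Math.Sign(diff), 1, this._total);
        this.SendShortcut(diff > 0 ? "NextHunk" : "PreviousHunk"); // hypothetical helper
        this.AdjustmentValueChanged(); // refresh the encoder display
    }

    // Shown on the encoder display, e.g. "5/12".
    protected override String GetAdjustmentValue(String actionParameter)
        => $"{this._position}/{this._total}";

    private void SendShortcut(String action)
    {
        // Would look up the active ShortcutProfile and emit the keybinding.
    }
}
```

Keeping the counter in the plugin rather than querying the IDE is what makes the "5/12" readout possible even though the IDE exposes no diff-state API.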

Haptic Vocabulary

Six distinct haptic patterns on MX Master 4, each mapped to a coding moment:

  • Accept → sharp_collision — crisp, decisive. You chose.
  • Reject → subtle_collision — soft dismissal. Moving on.
  • Diff scroll → damp_state_change — gentle tick per hunk. The rhythm of review.
  • End of diff → wave — smooth finish. You've seen it all.
  • Tests generated → firework — celebration. Coverage matters.
  • Bug found → knock — firm, repetitive. Look here.

After a week, your hands know the haptics. You stop looking at the console. Your eyes stay on the code. The vibe is unbroken.

IDE Flexibility

VibeDeck sends keyboard shortcuts — the same ones Cursor, Windsurf, VS Code + Copilot, and JetBrains AI Assistant already understand. The Action Editor settings panel lets you remap every shortcut:

  • Textbox per action (e.g., "Accept Suggestion: Tab", "Open Chat: Ctrl+L")
  • Dropdown to select IDE preset (Cursor / Windsurf / VS Code / JetBrains)
  • Checkbox to enable/disable each mode

Switch IDEs tomorrow. VibeDeck still works. Change a keybinding in your IDE. Update it in VibeDeck's settings. No code, no config files.
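The preset system described above can be sketched as a plain data class plus a preset table (the shortcut strings are illustrative defaults, not verified bindings for each IDE; user overrides from the Action Editor would be persisted on top of these):

```csharp
using System.Collections.Generic;

// Sketch of the per-IDE keybinding table. Only a few actions shown.
public sealed class ShortcutProfile
{
    public string AcceptSuggestion { get; set; } = "Tab";
    public string RejectSuggestion { get; set; } = "Escape";
    public string OpenChat        { get; set; } = "Ctrl+L";
    public string UndoLastChange  { get; set; } = "Ctrl+Z";

    // Built-in presets selected from the Action Editor dropdown.
    public static readonly Dictionary<string, ShortcutProfile> Presets = new()
    {
        ["Cursor"]   = new ShortcutProfile(),
        ["Windsurf"] = new ShortcutProfile { OpenChat = "Ctrl+Shift+L" }, // assumption
        // VS Code and JetBrains presets would follow the same pattern.
    };
}
```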

LCD Visuals

Every button is rendered with BitmapBuilder — not just text labels, but color-coded icons:

  • Accept: green background, white checkmark
  • Reject: red background, white X
  • Explain: purple background, lightbulb icon
  • Tests: blue background, flask icon
  • The encoder display shows the file name and change count during review
  • Animated color transitions when switching between modes

The console doesn't look like a shortcut grid. It looks like a purpose-built coding tool.
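For a single button, the rendering path might look like the following (the GetCommandImage override and BitmapBuilder usage follow the SDK samples; the exact drawing overloads and color values are assumptions):

```csharp
// Sketch: rendering the green Accept button on the LCD.
protected override BitmapImage GetCommandImage(string actionParameter, PluginImageSize imageSize)
{
    using (var builder = new BitmapBuilder(imageSize))
    {
        builder.Clear(new BitmapColor(0, 140, 60)); // green background
        builder.DrawText("✓ Accept", BitmapColor.White);
        return builder.ToImage();
    }
}
```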


How we built it

Pure C# on the Logi Actions SDK (.NET 8). No external servers, no AI integration on the plugin side — VibeDeck is a smart shortcut surface, not an AI engine.

Architecture:

  • VibeDeckPlugin — main plugin class. Detects active IDE via process name (Cursor, Windsurf, Code, idea64). Loads IDE-specific shortcut profiles. Registers haptic events.
  • VibeDeckDynamicFolder — the control surface. Three pages (Review, Chat, Explore) with automatic mode detection based on IDE state. All buttons rendered via BitmapBuilder with color-coded backgrounds and icons.
  • DiffScrollAdjustment — a PluginDynamicAdjustment for the dial. Sends Next/Previous diff hunk shortcuts. Tracks position and updates the encoder display. Triggers a haptic tick per scroll.
  • AcceptRejectCommand — a PluginMultistateDynamicCommand with Accept/Reject states. Sends the appropriate shortcut and triggers distinct haptics per state.
  • ExploreCommands — individual PluginDynamicCommand classes for Explain, Test, Refactor, Fix, Find Bugs. Each sends the IDE's specific shortcut.
  • InlineSuggestionsToggle — a PluginMultistateDynamicCommand (ON/OFF) with green/gray LCD states.
  • VibeDeckSettingsCommand — an ActionEditorCommand with IDE preset dropdown, per-action shortcut textboxes, and mode enable/disable checkboxes.
  • ShortcutProfile — data class holding all keybindings per IDE. Presets for Cursor, Windsurf, VS Code, JetBrains loaded from plugin settings.

SDK features used:

  • PluginDynamicFolder with multi-page layout (Review / Chat / Explore)
  • PluginMultistateDynamicCommand for Accept/Reject and inline toggle
  • PluginDynamicAdjustment for diff scrolling and suggestion cycling
  • BitmapBuilder for color-coded LCD buttons with icons and text
  • Plugin Events for 6 haptic waveforms mapped to coding actions
  • ActionEditorCommand with listbox (IDE presets), textboxes (shortcuts), checkboxes (modes)
  • ActionEditorListbox with dynamic population for IDE detection
  • Persistent settings for shortcut profiles and preferences
  • Process name detection via ClientApplication.GetProcessName() for auto-activation
  • Event source YAML + waveform mapping for MX Master 4 haptics

Footprint: ~800 lines of focused C#. No dependencies beyond the SDK.


Challenges we ran into

Every IDE has different shortcuts. Cursor uses Tab to accept, Ctrl+L for chat. Windsurf uses different bindings. VS Code + Copilot has its own set. JetBrains is a different universe. We had to build a shortcut profile system with presets and full customization — because no two developers have the same keybindings, even in the same IDE.

Mode detection without IDE APIs. We can't query Cursor's internal state to know if a diff is open or the chat panel is active. We rely on keyboard shortcuts that work regardless of state (the IDE ignores irrelevant ones gracefully) and let users manually switch modes via a page button. It's not magic auto-detection, but it's reliable and predictable — which matters more.

The dial granularity problem. How much should one dial tick scroll? One line? One hunk? One file? We settled on one hunk per tick for diff review (matches how developers actually review — hunk by hunk) and one suggestion per tick for suggestion cycling. The encoder display provides the position feedback that makes it feel precise.

Haptic restraint. With 15+ waveforms available, the temptation was to make everything buzz. We forced ourselves to pick 6 — one per meaningful coding moment — and made each feel distinct. Too many haptics and they become noise. Too few and you lose the vocabulary. Six is the sweet spot where your hands can learn the language.

The "just use keyboard shortcuts" argument. The hardest challenge was articulating why physical buttons matter when shortcuts exist. The answer: shortcuts require recall, buttons require recognition. Recognition is faster, lower cognitive load, and doesn't break flow. When you're deep in a vibe coding session, the last thing you want is to think "wait, is it Ctrl+Shift+Enter or Ctrl+Enter?"


Accomplishments that we're proud of

The dial-as-code-review-tool concept. As far as we know, nobody has done this before. Scrolling through diffs with a physical dial, feeling each hunk as a haptic tick, seeing the position on the encoder display — it transforms code review from a visual-only task into a multi-sensory experience. It's the single feature that makes VibeDeck more than a shortcut mapper.

IDE-agnostic by design. VibeDeck doesn't integrate with any specific IDE's API. It sends keyboard shortcuts. This means it works with Cursor today, Windsurf tomorrow, and whatever AI IDE ships next year. The shortcut profile system makes it future-proof.

The haptic vocabulary. Six patterns, six meanings, zero ambiguity. After a few sessions, developers report they stop looking at the console. Their hands know. That's the goal — eyes on code, hands on controls, brain on the problem.

One-tap rollback. The "Undo Last AI Change" button is the safety net that makes vibe coding less scary. Accepted something wrong? One press. No Ctrl+Z counting, no "how many undos was that?" Just one button: go back.

The complete workflow loop. Describe → Review (dial) → Accept/Reject (buttons) → Explore (explain/test/fix) → back to Describe. The entire vibe coding cycle has a physical interface. No step requires a keyboard shortcut.


What we learned

Vibe coding is a new input paradigm. It's not typing. It's not even traditional editing. It's a conversation with an AI where the developer's primary actions are reviewing, deciding, and directing. These actions map naturally to physical controls — buttons for binary decisions, dials for navigation, haptics for feedback. The keyboard is the wrong tool for this job.

Developers are tactile learners. We expected people to look at the button labels. Instead, they memorized positions within minutes and started using the console by feel. The haptic feedback accelerated this — each action has a physical signature. This suggests that hardware interfaces for coding are massively underexplored.

Less is more with modes. Our first design had 6 modes. We cut it to 3 (Review, Chat, Explore). Fewer modes means less context switching, which means the developer stays in flow. Each mode has exactly the controls you need and nothing you don't.

The dial changes the relationship with AI suggestions. When suggestions are a keyboard shortcut away, developers tend to accept the first one. When they're on a dial you can twist through, developers actually browse alternatives. The physical act of "flipping through options" encourages better code selection. The medium shapes the behavior.

Shortcut mapping is a solved problem — shortcut discovery is not. Every developer knows Ctrl+C copies. Almost nobody knows their IDE's "accept AI suggestion" shortcut without looking it up. VibeDeck's real value isn't sending shortcuts faster — it's making AI coding actions visible and discoverable through physical buttons with clear labels.


What's next for VibeDeck

Auto mode detection. Use window title parsing and clipboard monitoring to detect when a diff is open, when chat is active, or when the user is in normal editing. Switch modes automatically without user input.

Inline diff preview on LCD. Render a miniature diff view directly on the LCD buttons — green lines for additions, red for deletions — so you can see the shape of a change on the console before looking at the screen.

Voice-to-prompt. Hold a button, speak your coding intent ("add error handling to this function"), and VibeDeck types it into the AI chat. Vibe coding without typing the prompt either.

Git integration. A fourth mode for git operations — stage hunks with the dial (twist to select, press to stage), commit with a button, push with another. The same dial-as-review-tool concept applied to version control.

Pair programming mode. Two developers, two consoles, one screen. Driver gets the Explore mode, navigator gets the Review mode. Physical role separation for pair programming.

Community shortcut profiles. A shared repository of IDE shortcut presets that users can download and import. "Cursor + Vim keybindings" or "Windsurf + custom layout" — one click to load.

Marketplace launch. Package as .lplug4, submit to the Logitech Marketplace, and bring physical vibe coding controls to every MX Creative Console owner.


VibeDeck — Where vibe coding meets hardware. ⌨️🎛️

Built With

  • actionssdk
  • c#