FlowDeck — The brain behind every button.
Inspiration
We kept watching the same pattern: a creative professional buys an MX Creative Console, spends an evening setting up profiles for Photoshop, Premiere, VS Code, Zoom — and then never updates them again. The profiles go stale. Half the buttons do nothing in half the apps. The device that was supposed to remove friction becomes another thing to manage.
We asked a simple question: what if the console just figured it out on its own?
Not a smarter profile switcher. Not a template library. An actual brain — one that watches what you're doing, understands the context, and reconfigures every button, dial, color, and haptic response in real-time. No setup. No profiles. No manual switching. You open an app, and the console is already there waiting with exactly what you need.
That's FlowDeck.
What it does
FlowDeck turns the MX Creative Console into a living, adaptive control surface powered by AI.
A lightweight agent monitors your active application and system state. When you switch from Premiere to a Zoom call, the console transforms instantly — editing tools vanish, mute/camera/screen-share appear with color-coded LCD screens, and the dial becomes a volume knob. Open Spotify? Playback controls with vibrant colors fill the keypad. All automatic.
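The context snapshot the agent receives could be as simple as a small serialized record. This is an illustrative sketch only — the field names (`app_name`, `window_title`, `timestamp`) and wire format are assumptions, not FlowDeck's actual protocol:

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical shape of the context snapshot the plugin sends to the
# local agent server every few seconds. Field names are illustrative,
# not the actual FlowDeck wire format.
@dataclass
class ContextSnapshot:
    app_name: str      # foreground application, e.g. "zoom.us"
    window_title: str  # active window title, for finer-grained context
    timestamp: float   # when the snapshot was taken

def to_payload(snapshot: ContextSnapshot) -> str:
    """Serialize a snapshot for the HTTP POST to the agent server."""
    return json.dumps(asdict(snapshot))

payload = to_payload(ContextSnapshot("zoom.us", "Zoom Meeting", time.time()))
```

Keeping the snapshot this small means the agent decides layouts from context alone, without the plugin needing any per-app logic.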
But FlowDeck goes far beyond productivity shortcuts. You can speak to it:
- "Give me a drum pad" — eight velocity-sensitive pads appear, each with unique colors and haptic feedback on every hit
- "Simon Says game" — the LCD grid becomes a game board with flashing color sequences, score tracking, and haptic buzzes for right and wrong answers
- "Pomodoro timer" — a countdown with animated color transitions, haptic alerts when time's up, and dial control for adjusting duration
Every button press, dial rotation, and touch gesture feeds back to the AI in real-time over WebSocket. The agent remembers game state, tracks scores, responds to toggles — it's a fully interactive loop. The console isn't just displaying information, it's running live applications.
18 SDK tools are exposed to the AI: per-key LCD background colors, multi-state toggle buttons, touch gesture handlers (long-press, double-tap, swipe), all 15 haptic waveforms, LED indicator control, screen animations, multi-page layouts, icon templates, and persistent settings.
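As a flavor of what one of those tool functions might look like on the Python side, here is a hedged sketch. The name `set_key_color` and its parameters are assumptions for illustration — the real tool surface follows the Actions SDK capabilities listed above:

```python
# Hypothetical sketch of one agent tool function; the name and
# parameters are illustrative, not FlowDeck's actual tool schema.
def set_key_color(key_index: int, rgb: tuple[int, int, int]) -> dict:
    """Set the background color of one LCD key (0-8 on the 3x3 grid)."""
    if not 0 <= key_index <= 8:
        raise ValueError("key_index must be 0-8")
    if any(not 0 <= channel <= 255 for channel in rgb):
        raise ValueError("RGB channels must be 0-255")
    # In the real plugin this command would be forwarded to the C# side,
    # which renders the color onto the hardware key.
    return {"tool": "set_key_color", "key": key_index, "rgb": list(rgb)}
```

Validating arguments inside the tool keeps a hallucinated key index or color from ever reaching the hardware layer.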
How we built it
FlowDeck is a two-part system:
C# Loupedeck Plugin — built on the Logi Actions SDK. A ContextDetector monitors the foreground window every 2-3 seconds. An AgentBridgeClient sends context snapshots to the local AI server. A DynamicFolder renders the AI's layout decisions onto the hardware using BitmapBuilder for colored LCD keys, ProcessTouchEvent for gesture handling, and PluginEvents.RaiseEvent for haptic waveforms. An ActionRegistry maps action IDs to keyboard shortcuts, app launches, and system commands.
Python AI Agent Server — built with Google ADK (Agent Development Kit) and FastAPI, running on localhost:8765. The agent uses Gemini 2.5 Flash with a detailed system instruction that teaches it the full hardware capabilities. 18 Python tool functions let the AI configure every aspect of the device. A WebSocket endpoint (/ws) streams live button/dial/gesture events from the dashboard to the agent and pushes updated layouts back in real-time.
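The core of the `/ws` endpoint is an event-routing loop: incoming hardware events are dispatched by type, and the handler's result becomes the layout update pushed back over the socket. This sketch shows only that routing logic in framework-free Python; the event names and fields are assumptions for illustration:

```python
import json
from typing import Callable, Optional

# Sketch of the dispatch the /ws endpoint performs for each incoming
# message. Event names and fields are illustrative assumptions.
def route_event(raw: str, handlers: dict[str, Callable]) -> Optional[str]:
    event = json.loads(raw)
    handler = handlers.get(event.get("type"))
    if handler is None:
        return None              # unknown event types are ignored
    update = handler(event)
    return json.dumps(update)    # pushed back to the dashboard/console

handlers = {
    "button_press": lambda e: {"action": "toggle", "key": e["key"]},
    "dial_rotate":  lambda e: {"action": "volume", "delta": e["delta"]},
}
reply = route_event('{"type": "dial_rotate", "delta": 3}', handlers)
```

In the real server this dispatch would sit inside the FastAPI WebSocket receive loop, with handlers that call into the agent.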
Web Dashboard Simulator — a pixel-accurate replica of the MX Creative Console hardware rendered in HTML/CSS/JS. LCD touch keys with glass effects, a rotatable dial with counter-rotating center label, roller, corner buttons, LED indicator — all interactive. The bottom corner buttons are hardwired to voice input (speech recognition) and screen capture (sends a screenshot to the AI for visual context analysis). The dashboard connects via WebSocket for live event streaming.
The entire stack runs locally. No cloud dependency for core features. User data never leaves the machine.
Challenges we ran into
The Loupedeck NuGet package puzzle. The Loupedeck.PluginApi package isn't on nuget.org — it's provided through LogiPluginTool's local feed. That meant writing the entire C# plugin against the SDK documentation without being able to compile it until the tool is installed. Every API call, every override, every event registration came from docs alone.
Gemini model deprecation mid-build. We started with gemini-2.0-flash, which was deprecated during development. We had to switch to gemini-2.5-flash and adjust the agent configuration.
Making the AI actually creative. Early versions produced boring, repetitive layouts — the same mute/volume/play buttons every time. We rewrote the system instruction from scratch, teaching the AI to think like a UX designer: color theory, haptic waveform selection guides, game design patterns, instrument layouts. The difference was night and day.
Real-time interactive loop. Getting the WebSocket event stream right was tricky — we needed debouncing for dial rotation (400ms), a mutex lock to prevent concurrent agent calls from corrupting tool state, and careful separation between HTTP command responses and WebSocket-pushed updates so layouts don't double-apply.
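The two concurrency guards above can be sketched in a few lines, assuming an asyncio-based server (names like `on_dial_event` are ours, for illustration): a 400 ms debounce window drops rapid-fire dial events, and a lock serializes agent calls so two layouts can't be computed at once.

```python
import asyncio
import time

# Illustrative sketch of the debounce + mutex pattern described above,
# assuming an asyncio server. Not FlowDeck's actual code.
DEBOUNCE_S = 0.4                 # 400 ms debounce window for dial rotation
agent_lock = asyncio.Lock()      # only one agent call may run at a time
_last_dial = 0.0

async def on_dial_event(delta: int) -> bool:
    """Return True if the event was forwarded to the agent."""
    global _last_dial
    now = time.monotonic()
    if now - _last_dial < DEBOUNCE_S:
        return False             # dropped: still inside the debounce window
    _last_dial = now
    async with agent_lock:       # prevent concurrent calls corrupting state
        await asyncio.sleep(0)   # placeholder for the real agent call
    return True

async def demo():
    first = await on_dial_event(1)
    second = await on_dial_event(1)  # arrives immediately, so it's debounced
    return first, second

results = asyncio.run(demo())
```

Debouncing on the server side (rather than in the dashboard) keeps the guarantee intact no matter how events arrive.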
The renderCornerButtons bug. Setting textContent on a DOM element destroys its child nodes. The corner buttons had child <span> elements for labels that kept getting wiped. Small bug, hours of confusion.
Accomplishments that we're proud of
18 SDK tools exposed to AI — we didn't just map buttons. We implemented multi-state toggles, touch gesture handlers, all 15 haptic waveforms, LED control, screen animations, icon templates, multi-page layouts, persistent settings, and dynamic folder control. The AI has the full power of the Actions SDK at its fingertips.
Live interactive applications on hardware. You can play Simon Says on an MX Creative Console. The AI tracks game state across button presses, updates the board, keeps score, fires celebration haptics when you win. That's not a macro pad anymore — that's a platform.
Zero-configuration adaptive UI. No profiles. No setup wizard. No template marketplace. You plug in the console and it works. Switch apps and it follows. This is what smart hardware should feel like.
Voice + Vision input. Speak naturally to reshape your console. Share your screen and the AI sees what you're working on for even smarter context detection. The two bottom corner buttons on the dialpad are dedicated hardware triggers for these — mic and screen capture, always one press away.
The dashboard simulator. A faithful hardware replica that lets you develop and demo without physical hardware. Every LCD key renders with real background colors, the dial physically rotates, haptic events show as banners, the LED glows and pulses. It's genuinely fun to use.
What we learned
The Logi Actions SDK is deeper than most people realize. Multi-state commands, touch event processing, haptic waveform mapping, icon templates — there's a full application platform hiding in there. Most plugins barely scratch the surface.
AI agents need constraints to be creative. An open-ended "do whatever you want" instruction produces mediocre results. But give the AI a detailed capability map, color theory guidelines, haptic selection tables, and concrete examples — and it starts designing experiences that surprise you.
WebSocket makes hardware feel alive. The difference between "press button → see result next time you send a command" and "press button → instant reaction" is the difference between a tool and an experience.
Building against an SDK you can't compile teaches you to read documentation very carefully.
What's next for FlowDeck
Real hardware testing. The C# plugin is built and structured correctly but hasn't been compiled against the actual Loupedeck.PluginApi package yet. First priority is installing LogiPluginTool, resolving the NuGet package, and running on a physical MX Creative Console.
Persistent mode memory. Right now the AI starts fresh each session. We want it to remember your preferred layouts per app — learning over time that you always want the timeline scrubber on dial slot 0 in Premiere, or that you prefer warm colors for music apps.
Plugin runtime system. We prototyped a system where the AI can write and deploy Python backend code at runtime — timer loops, game engines, API integrations. The foundation (plugin_runtime.py) is built but not yet integrated. This would let FlowDeck create truly autonomous applications: a Pomodoro timer that counts down on its own, a stock ticker that updates live, a notification hub that pulses the LED when emails arrive.
Community mode sharing. Let users share their favorite AI-generated layouts as "mode seeds" — short text prompts that reproduce a specific experience. "Drum pad with jazz colors" or "Pomodoro with 25/5 intervals" become shareable one-liners.
Multi-device orchestration. MX Creative Console + MX Master 4 working together — the mouse's haptic feedback synchronized with the console's LED and button states. The AI coordinates both devices as a single experience.
Logitech Marketplace submission. Package FlowDeck as a production plugin with proper installer, settings UI in Logi Options+, and onboarding flow. Target: available for every MX Creative Console owner.
Built With
- actionssdk
- geminilive
- mxcreativeconsole