Inspiration

The DevStudio Challenge asked for a "Genie in a bottle," but most AI integrations today are essentially amnesiacs. When a developer highlights a bug in Cursor or Xcode and sends it to ChatGPT, the AI lacks the context of the rest of the codebase. Developers are forced into massive, manual context switching: copy-pasting logs, dependencies, and file structures.

We realized the future of hardware-integrated AI isn't just sending text to an API; it is triggering Agentic Workflows. We were inspired by the Model Context Protocol (MCP) to build FlowState, turning the Logitech MX ecosystem into a physical transmission for autonomous AI agents that can securely "see" and interact with your local development environment.

What it does

We mapped three distinct, context-aware Agentic personas to the physical inputs of the MX ecosystem:

The Agentic Coder (MX Creative Console Dial): Highlight a function in Xcode and roll the smooth dial right. Instead of a blind rewrite, FlowState triggers a coding agent. Using MCP, the agent securely reads your local project's style guidelines and related dependencies, then dynamically streams the refactored code directly into your IDE as you turn the dial.

The Debugging Agent (MX Console Keypad + Haptics): Highlight an error in your terminal and press the "Bug" LCD key. You don't wait for a chat window. Our debugging agent autonomously uses MCP to read your local crash logs and environment variables. Once the agent formulates a verified fix, the MX Master 4 delivers a haptic pulse, and the exact terminal command pops up in the Actions Ring, ready to be executed.

The Comprehension Agent (MX Master 4 + Actions Ring): Highlight a dense Jira ticket or local PDF and click the thumb button. The agent reads the local file context and instantly overlays an actionable, bulleted summary natively at your cursor via the Actions Ring UI.

How we built it

To make this work seamlessly across locked-down IDEs and apps, we built a local Agentic Orchestrator.

The Trigger: The Logitech Actions SDK registers a physical button press or dial turn.

Context Capture & MCP Host: Our local background service captures the user's highlighted text. More importantly, it acts as an MCP Host, maintaining secure, standardized connections to local resources (like a git repository or local log files).

Agentic Routing: The service sends the user's prompt along with the MCP-provided local context to the LLM (e.g., Claude 3.5 Sonnet, which natively supports MCP).

Autonomous Execution & Tactile Return: The AI agent reasons through the problem using the provided local data. The resulting insight is piped into the Logitech Actions Ring overlay, or injected directly into the active application.
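The four steps above can be sketched as a minimal dispatch loop. This is an illustrative sketch only: `TriggerEvent`, `capture_context`, and the route table are hypothetical stand-ins, not the actual Logitech Actions SDK or MCP client APIs.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TriggerEvent:
    """A physical input event from the device (hypothetical shape)."""
    control: str    # e.g. "dial", "bug_key", "thumb_button"
    selection: str  # text the user had highlighted when the control fired

def capture_context(selection: str) -> dict:
    # Step 2 stand-in for the MCP Host: the real service would read local
    # resources (git repository, log files) over live MCP connections.
    return {"selection": selection, "resources": ["git", "logs"]}

# Step 3: each physical control maps to one agentic persona.
AGENT_ROUTES = {
    "dial": "coder",
    "bug_key": "debugger",
    "thumb_button": "summarizer",
}

def route(event: TriggerEvent,
          call_llm: Callable[[str, dict], str]) -> str:
    """Trigger (1) -> context capture (2) -> routing (3) -> execution (4)."""
    context = capture_context(event.selection)
    persona = AGENT_ROUTES[event.control]
    # Step 4: the result would be piped to the Actions Ring overlay
    # or injected into the active application.
    return call_llm(persona, context)

# Usage with a fake LLM call in place of a real model endpoint:
fake_llm = lambda persona, ctx: f"[{persona}] handled: {ctx['selection']}"
print(route(TriggerEvent("bug_key", "NullPointerException"), fake_llm))
# → [debugger] handled: NullPointerException
```

The routing table is the key design choice: adding a fourth persona means adding one entry and one handler, without touching the trigger or context-capture layers.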

Challenges we ran into

Integrating the Model Context Protocol with a hardware SDK was uncharted territory. Managing the asynchronous nature of Agentic AI—where an agent might take 5 to 10 seconds to read local files and formulate a plan—felt sluggish at first. We solved this by leaning heavily into the MX Master 4's hardware haptics. By using tactile feedback to signal when an agent is "thinking" and when it is "done," we completely eliminated the need for loading screens, preserving the user's flow state.
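The tactile loop described above amounts to bracketing the agent's slow, asynchronous work with two pulses. A minimal asyncio sketch, where `haptic_pulse` is a hypothetical stand-in for the device call (the real prototype would fire the MX Master 4's haptic engine):

```python
import asyncio

async def haptic_pulse(label: str) -> None:
    # Hypothetical stand-in for the hardware haptics call.
    print(f"haptic: {label}")

async def run_agent_with_haptics(agent_task) -> str:
    await haptic_pulse("thinking")  # tactile cue: agent has started
    result = await agent_task       # the 5-10 s of file reads and planning
    await haptic_pulse("done")      # tactile cue: verified fix is ready
    return result

async def demo() -> str:
    async def slow_agent() -> str:
        await asyncio.sleep(0.1)    # stands in for real agent latency
        return "fix-command --apply"
    return await run_agent_with_haptics(slow_agent())

print(asyncio.run(demo()))
```

Because the pulses are awaited around the task rather than woven into it, no on-screen spinner is ever needed: the user's eyes stay on the code while their hand reports the agent's state.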

Accomplishments that we're proud of

We are incredibly proud of achieving true Zero Context Switching. Bypassing OS sandboxing to create an invisible, universally compatible clipboard bridge allowed us to bring Agentic AI to notoriously closed environments like Apple's Xcode. Furthermore, successfully marrying the physical haptic engine of the MX Master 4 to the asynchronous "thinking" phase of a cloud-based AI agent created a tactile UX loop that feels like genuine magic.

What we learned

We learned that hardware-to-cloud latency is the ultimate killer of user experience. We had to dive deep into API streaming to ensure that as a user turns the MX Creative Console dial, the code refactoring appears instantly on screen, rather than waiting for a bulk JSON response. We also mastered the intricacies of the Model Context Protocol, learning how to safely expose local file systems to LLMs without compromising security.
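The streaming lesson reduces to consuming the model's response chunk by chunk and forwarding each chunk immediately, instead of buffering the whole completion. A sketch with a generator standing in for the LLM's chunked response (the injector here just prints; the real service would write into the active editor):

```python
from typing import Iterator

def stream_refactor(chunks: Iterator[str]) -> Iterator[str]:
    # Forward each chunk to the IDE injector the moment it arrives,
    # rather than waiting for one bulk JSON response at the end.
    for chunk in chunks:
        yield chunk  # inject into the editor immediately

# Usage: a fake two-chunk stream in place of a real API response.
fake_stream = iter(["def add(a, b):\n", "    return a + b\n"])
for piece in stream_refactor(fake_stream):
    print(piece, end="")  # code appears on screen as the dial turns
```

The generator shape matters: because nothing is accumulated before yielding, the first characters reach the screen as soon as the first chunk lands, which is what makes the dial feel instantaneous.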

What's next for FlowState Genies: The Tactile Agentic AI Co-Pilot

We plan to expand our MCP integrations to include enterprise databases (like Snowflake) and local Docker containers, allowing the MX Console keypad to physically deploy and monitor Agentic testing environments with a single keystroke.
