Inspiration
As student developers and creators, we use Generative AI daily, but we quickly developed "chatbox fatigue." Every time we were in the zone, deep in an IDE like Cursor or building a scene in Blender, we had to stop, break our flow state, copy-paste context into a browser window, and wait.
We realised that while AI is incredibly advanced, the way we interact with it is stuck in the past. We thought: What if AI felt like a physical instrument? What if we could sculpt code and design the way a musician tunes a synthesiser? When we saw the Logitech MX Creative Console and the MX Master 4, we knew we had the perfect hardware to bring AI out of the browser and directly into our hands.
What it does
Genie Keys is a plugin that transforms the MX Creative Console and MX Master 4 into a tactile, spatial control board for Generative AI. It stops treating AI like a chatbot and starts treating it like an integrated tool.
- The Contextual Lasso (MX Master 4): Using the Actions Ring, you can highlight a block of code or text and squeeze/click to bring up a fluid, radial AI menu right at your cursor. Flick up to Refactor, right to Explain, or down to Document. Your hands never leave the mouse.
- Genies in a Bottle (Creative Console LCDs): The display keys act as dynamic AI Personas that change based on your active app. In Xcode, they become The Bug Finder and The Documenter. In Blender, they switch to Lighting Genie and Texture Genie.
- The Iteration Engine (The Dial): Instead of settling for the first thing the AI generates, Genie Keys generates multiple variations. You use the smooth dial on the Creative Console to physically "scrub" through these AI variations in real-time, watching the text or design morph on your screen.
How we built it
We built the core plugin using the Logitech Actions SDK, creating a bridge between the physical hardware events and a local Node.js background service.
The backend constantly monitors the active window state to dynamically push new UI profiles to the Creative Console's LCD keys. When the MX Master 4's Actions Ring is triggered, our background service grabs the highlighted text from the clipboard and wraps it in a system prompt based on the flick direction.
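In simplified form, that loop looks something like the sketch below. `active-win` and `clipboardy` are the packages from our stack; the `ringFlicked` event name, `pushProfileToConsole`, and the profile/prompt maps are condensed placeholders for illustration:

```javascript
import activeWin from 'active-win';
import clipboard from 'clipboardy';

// Illustrative persona profiles keyed by the active app's name
const PROFILES = {
  Xcode: ['Bug Finder', 'Documenter'],
  Blender: ['Lighting Genie', 'Texture Genie'],
};

// Illustrative system prompts keyed by Actions Ring flick direction
const SYSTEM_PROMPTS = {
  up: 'Refactor the following code:',
  right: 'Explain the following code:',
  down: 'Write documentation for the following code:',
};

// Poll the active window and push a matching profile to the LCD keys
setInterval(async () => {
  const win = await activeWin();
  const profile = PROFILES[win?.owner.name];
  if (profile) pushProfileToConsole(profile); // placeholder helper
}, 500);

// On an Actions Ring flick, wrap the highlighted text in a system prompt
sdk.on('ringFlicked', async (event) => {
  const selection = await clipboard.read();
  const prompt = `${SYSTEM_PROMPTS[event.direction]}\n\n${selection}`;
  aiService.send(prompt); // placeholder helper
});
```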
To make the physical dial control the AI's "creativity" (temperature) on the fly, we mapped the rotary encoder's rotation angle onto the LLM API's accepted temperature range. We implemented this with a linear interpolation formula in our backend to translate physical hardware ticks into API parameters seamlessly:
$$T(\theta) = T_{min} + \left(\frac{\theta}{\theta_{max}}\right)(T_{max} - T_{min})$$
Here is a simplified look at how we handled the hardware events in our backend:
```javascript
sdk.on('dialTurned', (event) => {
  const theta = event.rotationAngle;
  const currentTemp = calculateTemperature(theta);

  // Update the AI service and push live feedback to the LCD key
  aiService.updateTemperature(currentTemp);
  updateDisplayKey(event.deviceId, `Temp: ${currentTemp.toFixed(1)}`);
});
```
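The `calculateTemperature` helper is a direct translation of the interpolation formula above; the 270° dial sweep and the 0.0–2.0 temperature bounds are illustrative defaults:

```javascript
// T(θ) = T_min + (θ / θ_max)(T_max - T_min)
const THETA_MAX = 270; // illustrative dial sweep in degrees
const TEMP_MIN = 0.0;
const TEMP_MAX = 2.0;

function calculateTemperature(theta) {
  // Clamp so over-rotation never produces an out-of-range temperature
  const clamped = Math.min(Math.max(theta, 0), THETA_MAX);
  return TEMP_MIN + (clamped / THETA_MAX) * (TEMP_MAX - TEMP_MIN);
}
```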
Challenges we ran into
Our biggest hurdle was the clash between cloud latency and tactile expectations. When a user turns a physical dial, they expect instant, zero-latency feedback. If you have to wait 3 seconds for an LLM API to respond with every tick of the dial, the hardware feels broken.
We solved this through Asynchronous Pre-generation. When you invoke a Genie via the Actions Ring, the backend doesn't just ask for one answer; it silently asks the AI to pre-generate an array of 5 distinct variations in the background. By the time your hand reaches the Creative Console dial, the array is cached locally. Spinning the dial simply scrubs through the local cache, making the physical-to-digital connection feel lightning-fast.
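A stripped-down sketch of that pattern, assuming the official `openai` Node client; the model name, temperature spread, and cache shape are illustrative:

```javascript
import OpenAI from 'openai';

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment
let variationCache = [];

// Fire five requests in parallel across a spread of temperatures,
// so every dial position has a variation waiting in the local cache.
async function pregenerateVariations(prompt) {
  const temperatures = [0.2, 0.6, 1.0, 1.4, 1.8]; // illustrative spread
  variationCache = await Promise.all(
    temperatures.map(async (temperature) => {
      const res = await client.chat.completions.create({
        model: 'gpt-4o-mini', // illustrative model choice
        messages: [{ role: 'user', content: prompt }],
        temperature,
      });
      return res.choices[0].message.content;
    })
  );
}

// Scrubbing the dial only indexes into the cache: zero network latency.
function onDialScrub(position) {
  const index = Math.min(position, variationCache.length - 1);
  return variationCache[index];
}
```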
Accomplishments that we're proud of
We are incredibly proud of cracking the latency problem. Making the "scrubbing" interaction feel natural and buttery smooth validated the entire concept of tactile AI. We're also really proud of how seamlessly we got the Actions SDK to hand off context from a micro-interaction on the mouse directly over to the macro-controls on the Creative Console, without the user ever needing to look away from their screen.
What we learned
We learned that hardware UX requires a completely different mindset than software UX. A digital button can afford to have a loading spinner; a physical dial cannot. We levelled up our skills in state management, asynchronous JavaScript, and prompt engineering. Most importantly, we learned that the next big frontier in AI isn't just training larger models—it's building better, more human-centric interfaces to control them.
What's next for Genie Keys
We want to expand Genie Keys far beyond text and code manipulation. Our next step is to build direct WebSocket integrations with tools like ComfyUI and Blender. Imagine using the Actions Ring to select a 3D object, and then turning the Creative Console dial to physically scrub through different AI-generated textures applied to that object in real-time.
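ComfyUI already exposes a WebSocket endpoint for progress events, so the listening side of that integration could start as small as this sketch (the `updateDisplayKey` hook is our placeholder; the endpoint and message shape follow ComfyUI's documented API):

```javascript
import WebSocket from 'ws';

// ComfyUI serves a WebSocket at /ws; clientId ties progress events to us
const ws = new WebSocket('ws://127.0.0.1:8188/ws?clientId=genie-keys');

ws.on('message', (data, isBinary) => {
  if (isBinary) return; // preview images arrive as binary frames
  const msg = JSON.parse(data.toString());
  if (msg.type === 'progress') {
    // Mirror generation progress onto a Creative Console LCD key
    updateDisplayKey('texture-genie', `Gen: ${msg.data.value}/${msg.data.max}`);
  }
});
```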
Built With
- actions
- active-win
- api
- clipboardy
- javascript
- logitech
- node.js
- openai
- sdk
- typescript