Inspiration

Creative software has become incredibly powerful, yet interacting with it is still fragmented and indirect.

The Logitech MX Creative Console introduced physical controls with LCD buttons, but today they function only as static shortcuts that users must manually configure, remember, and constantly switch between.

At the same time, AI has become a powerful creative partner - but it exists only inside chat interfaces, disconnected from physical workflows and tools.

We asked a simple question:

What if AI wasn’t just something you read on a screen - but something you could physically see, touch, and instantly apply?

Loge was created to bridge the gap between AI intelligence and tangible creative control.


What it does

Loge transforms Logitech’s MX Creative Console from a static shortcut panel into a live AI interaction surface.

Instead of displaying fixed icons, the LCD buttons show real AI-generated results that users can instantly apply with one press.

With Loge, users can:

• Ask for creative suggestions using voice or text
• See AI outputs directly on the console buttons
• Instantly apply colors, fonts, layouts, and effects
• Analyze on-screen content for contextual actions
• Generate dynamic tool buttons tailored to their workflow

Loge also integrates into the MX Master mouse Actions Ring, enabling seamless AI access without interrupting the creative flow.


How we built it

Loge is designed as an Actions SDK plugin concept built around three core systems:

• Hardware integration through the Logitech Actions SDK
• AI generation and reasoning APIs
• Context detection based on active applications and screen content

We designed a new interaction model where physical LCD buttons act as a real-time AI output interface, capable of displaying dynamic results instead of static controls.

The focus of development was not only technical feasibility, but also redefining how physical hardware can display and execute AI-driven actions.
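The core of this interaction model can be sketched in a few lines of TypeScript. This is an illustrative sketch only: the `Suggestion` and `LcdButton` shapes and the `bindSuggestions` function are hypothetical names, not part of the real Logitech Actions SDK.

```typescript
// Hypothetical sketch of Loge's interaction model (not real Actions SDK code):
// a batch of AI suggestions is mapped onto the console's LCD keys, and
// pressing a key applies that suggestion's payload.

type Suggestion = { label: string; payload: string };

interface LcdButton {
  index: number;
  label: string;          // text rendered on the LCD key
  onPress: () => string;  // action executed when the key is pressed
}

// Map AI suggestions onto a fixed grid of LCD buttons (e.g. a 3x3 console).
function bindSuggestions(suggestions: Suggestion[], gridSize: number): LcdButton[] {
  return suggestions.slice(0, gridSize).map((s, i) => ({
    index: i,
    label: s.label,
    // In Loge this would call into the active creative application;
    // here it just reports what would be applied.
    onPress: () => `applied:${s.payload}`,
  }));
}

const buttons = bindSuggestions(
  [
    { label: "Warm palette", payload: "#E07A5F" },
    { label: "Cool palette", payload: "#3D5A80" },
  ],
  9
);

console.log(buttons[0].onPress()); // applied:#E07A5F
```

The key design point is that the button's face and its action come from the same AI result, so what the user sees on the LCD is exactly what one press will apply.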


Challenges we ran into

The biggest challenge was rethinking the role of hardware buttons.

Traditional physical interfaces are designed for predictable, fixed actions, while AI outputs are dynamic and constantly changing.

Designing a system where meaningful AI results could be represented on small LCD screens - while remaining instantly understandable and actionable - required extensive UX experimentation.
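One of the UX experiments from this phase can be illustrated with a small sketch: compressing an AI suggestion into a tiny LCD "face" consisting of a short label plus an optional color swatch. The `toFace` helper and its types are hypothetical, invented for illustration.

```typescript
// Hypothetical sketch (not real SDK code): condense an AI suggestion into
// something that stays readable on a small LCD key.

type Face = { label: string; swatch?: string };

function toFace(suggestion: { label: string; color?: string }, maxChars = 10): Face {
  // Long labels are truncated with an ellipsis so they never overflow the key.
  const label =
    suggestion.label.length <= maxChars
      ? suggestion.label
      : suggestion.label.slice(0, maxChars - 1) + "…";
  return { label, swatch: suggestion.color };
}

// A 20-character suggestion becomes "Cinematic…" plus its swatch color.
console.log(toFace({ label: "Cinematic teal grade", color: "#117788" }));
```

Even a constraint this simple forces a choice between text and iconography on every key, which is where most of our UX iteration went.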

Another challenge was ensuring that Loge remains non-intrusive, enhancing workflows without disrupting user focus.


Accomplishments that we’re proud of

We are most proud of introducing a completely new interaction paradigm:

Turning hardware buttons into a live AI output interface.

Instead of reading AI suggestions in a separate window and manually applying them, users can directly execute AI-generated results through physical controls.

This creates a seamless bridge between ideation, decision-making, and execution in creative workflows.


What we learned

This project showed us how powerful AI becomes when it moves beyond traditional software interfaces and integrates into physical workflows.

We learned that creators don’t just need smarter tools - they need faster and more intuitive ways to act on AI insights.

We also discovered that context-aware, non-intrusive AI assistance provides significantly more value than constant automated suggestions.


What’s next for Loge - Your AI-Powered Creative Agent

Our next step is developing a functional prototype using the Logitech Actions SDK and validating the interaction model with real creative professionals.

Future development will focus on:

• Expanding integrations with major creative software
• Enhancing personalized workflow learning
• Improving real-time context awareness
• Bringing adaptive AI controls to more Logitech devices

Our long-term vision is to make AI a tangible, physical layer of creative work - not just a software feature.

Built With

  • aftereffect