Inspiration

Startup founders constantly switch between building products and communicating value. While coding in VS Code or designing in Figma, they operate in “builder mode”. But when sending investor emails or posting on LinkedIn, they must instantly switch to “communication mode”.

I asked a simple question:

What if Logitech MX hardware could become a physical AI control surface for that transition?

Instead of opening multiple tools, adjusting prompts manually, or rewriting content repeatedly, what if a dial, a button, or a ring could directly control AI tone, format, and output?

That idea became VibePitch Command Console.


What It Does

VibePitch transforms the Logitech MX Creative Console into a context-aware AI command centre.

It simulates the Logitech Actions SDK layer and maps hardware interactions to AI workflows:

  • Dial → Adjust tone intensity dynamically
  • Button 1 → Generate pitch
  • Button 2 → Improve tone
  • Button 3 → Convert to LinkedIn post
  • Double Tap → Draft investor email
  • Ring Rotation → Control teleprompter speed
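The mapping above can be sketched as a small state reducer. This is an illustrative sketch, not the real Actions SDK API: the event shape, action names, and `ConsoleState` fields are all assumptions.

```typescript
// Hypothetical event and state types for the hardware-to-action mapping.
type HardwareEvent =
  | { kind: "dial"; delta: number }       // tone intensity
  | { kind: "button"; id: 1 | 2 | 3 }     // pitch / tone / LinkedIn
  | { kind: "doubleTap" }                 // investor email
  | { kind: "ring"; delta: number };      // teleprompter speed

interface ConsoleState {
  toneIntensity: number;      // 0..1, driven by the dial
  teleprompterSpeed: number;  // words per minute, driven by the ring
  pendingAction: string | null;
}

const clamp01 = (x: number): number => Math.min(1, Math.max(0, x));

function applyEvent(state: ConsoleState, ev: HardwareEvent): ConsoleState {
  switch (ev.kind) {
    case "dial":
      return { ...state, toneIntensity: clamp01(state.toneIntensity + ev.delta) };
    case "button": {
      // Illustrative action names, one per physical button.
      const actions = { 1: "generate-pitch", 2: "improve-tone", 3: "linkedin-post" } as const;
      return { ...state, pendingAction: actions[ev.id] };
    }
    case "doubleTap":
      return { ...state, pendingAction: "investor-email" };
    case "ring":
      return { ...state, teleprompterSpeed: Math.max(0, state.teleprompterSpeed + ev.delta * 10) };
  }
}
```

Keeping the mapping in one pure function makes each physical gesture testable in isolation, independent of any real hardware.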

The system adapts formatting based on working context:

  • VS Code → Technical pitch
  • LinkedIn → Thought leadership tone
  • Gmail → Investor-ready email formatting
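One way to express that context-to-format adaptation is a lookup table of formatting profiles. The specific profile fields, length limits, and structure strings below are illustrative assumptions, not the project's actual rules.

```typescript
// Hypothetical formatting profiles keyed by the active working context.
type WorkContext = "vscode" | "linkedin" | "gmail";

interface FormatProfile {
  style: string;      // overall voice to request from the model
  maxLength: number;  // rough character budget for the output
  structure: string;  // skeleton the formatter asks the model to follow
}

const FORMAT_PROFILES: Record<WorkContext, FormatProfile> = {
  vscode:   { style: "technical pitch",    maxLength: 800,  structure: "problem / solution / stack" },
  linkedin: { style: "thought leadership", maxLength: 1300, structure: "hook / story / call to action" },
  gmail:    { style: "investor email",     maxLength: 600,  structure: "subject / ask / traction / close" },
};

function profileFor(context: WorkContext): FormatProfile {
  return FORMAT_PROFILES[context];
}
```

A table like this keeps context logic declarative: adding a new app means adding one row, not a new branch of prompt code.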

How I Built It

Architecture:

Logitech Dial / Buttons
→ Simulated Actions SDK Layer
→ Context Engine
→ Gemini API
→ Output Formatter
→ Reactive UI Panel

The hardware layer is simulated using a custom MxHardwareSim component that emits SDK-like events. These events dynamically modify AI prompts before sending them to Gemini 1.5 Pro.
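A minimal version of that simulated layer is a small publish/subscribe emitter. The event names and listener API here are assumptions based on the description above, not the real Actions SDK surface.

```typescript
// Minimal sketch of an MxHardwareSim-style emitter: downstream code
// subscribes to SDK-like events exactly as it would to real hardware.
type Listener = (payload: unknown) => void;

class MxHardwareSim {
  private listeners = new Map<string, Listener[]>();

  on(event: string, fn: Listener): void {
    const existing = this.listeners.get(event) ?? [];
    this.listeners.set(event, [...existing, fn]);
  }

  // Simulate a physical interaction, e.g. a dial turn or button press.
  emit(event: string, payload: unknown): void {
    for (const fn of this.listeners.get(event) ?? []) fn(payload);
  }
}

// Usage: the context engine listens for dial rotation the same way it
// would listen to a real device.
const sim = new MxHardwareSim();
sim.on("dial:rotate", (delta) => console.log("tone delta", delta));
sim.emit("dial:rotate", 0.1);
```

Because consumers only see events, the simulator could later be swapped for the real Actions SDK without touching the AI logic.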

The UI is fully reactive with zero page reloads and real-time event logging for transparency.


Challenges

The biggest challenge was designing a clean abstraction between hardware events and AI logic.

Instead of hardcoding prompt variations, I built a context engine that modifies prompt parameters based on:

  • Active context
  • Tone intensity
  • Selected output mode

This allowed the dial to mathematically influence the "professional intensity" of the generated text.
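The dial-to-tone mapping above can be sketched as a simple interpolation over a descriptor ladder. The ladder words and prompt wording are illustrative assumptions, not the project's actual prompts.

```typescript
// Map a dial intensity in [0, 1] onto a ladder of tone descriptors,
// then fold context, intensity, and output mode into one prompt string.
const TONE_LADDER = ["casual", "conversational", "confident", "polished", "boardroom-formal"];

function toneDescriptor(intensity: number): string {
  const i = Math.min(TONE_LADDER.length - 1, Math.floor(intensity * TONE_LADDER.length));
  return TONE_LADDER[i];
}

function buildPrompt(context: string, intensity: number, mode: string, draft: string): string {
  return [
    `Rewrite the following as a ${mode} for the ${context} context.`,
    `Target tone: ${toneDescriptor(intensity)} (intensity ${intensity.toFixed(2)}).`,
    draft,
  ].join("\n");
}
```

Turning the dial changes only one numeric parameter, but that number deterministically selects the tone language sent to the model, which is what makes the control feel "mathematical" rather than arbitrary.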

Another challenge was designing the UI to clearly visualise hardware-state simulation without overwhelming the user.


What I Learned

  • Hardware UX is about tactile feedback and predictability.
  • AI tools become significantly more powerful when paired with physical control surfaces.
  • Context-aware prompting dramatically improves output relevance.
  • Clean system architecture matters more than visual complexity.

Future Vision

  • Full Logitech Actions SDK integration via Logi Options+
  • Marketplace plugin distribution
  • Custom macro mapping for AI prompts
  • Multi-app live context detection

VibePitch reimagines Logitech MX hardware not just as input devices but as intelligent AI control surfaces for modern creators.
