RhythmForge VR — Project Description

DevStudio 2026 by Logitech | MX Ink (MR Stylus for Meta Quest) Category

Project Story

About the project

RhythmForge VR started from a simple question: what if making music in VR felt less like operating software and more like drawing, conducting, and shaping sound with your hands?

How we built it

RhythmForge VR was built in Unity for Meta Quest, using the Meta XR / OpenXR stack together with the Logitech MX Ink stylus interaction profile. The stylus became the center of the experience. Instead of pressing buttons to place notes on a grid, the user draws one stroke in 3D space, and the system analyzes that shape to derive musical behavior.

To make that work, we built the app around several connected systems (a short code sketch of the first two follows the list):

  • real-time stroke capture in 3D space using the MX Ink stylus
  • shape analysis that extracts features like contour, symmetry, angularity, and tilt
  • a guided composition model that maps each stroke to one musical phase
  • genre-aware derivation rules for Electronic, New Age, and Jazz
  • a playback layer that combines Harmony, Melody, Groove, Bass, and Percussion into a single coherent loop
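
As a rough illustration of the first two systems, here is a minimal sketch of the kind of feature extraction a captured stroke might go through. All names here (StrokeFeatures, StrokeAnalysis.Extract) and the specific heuristics are illustrative, not the shipped code:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative feature bundle; the names are ours, not the shipped code.
public struct StrokeFeatures
{
    public float Length;      // total path length in metres
    public float Angularity;  // mean turning angle between segments, degrees
    public float Tilt;        // elevation of the start-to-end axis, degrees
    public float Symmetry;    // 0..1, how closely the halves mirror each other
}

public static class StrokeAnalysis
{
    public static StrokeFeatures Extract(IReadOnlyList<Vector3> pts)
    {
        var f = new StrokeFeatures();
        if (pts.Count < 3) return f;

        float turnSum = 0f;
        for (int i = 1; i < pts.Count; i++)
        {
            f.Length += Vector3.Distance(pts[i - 1], pts[i]);
            if (i < pts.Count - 1)
                turnSum += Vector3.Angle(pts[i] - pts[i - 1], pts[i + 1] - pts[i]);
        }
        f.Angularity = turnSum / (pts.Count - 2);

        // Tilt: how far the stroke's main axis points above or below horizontal.
        Vector3 axis = (pts[pts.Count - 1] - pts[0]).normalized;
        f.Tilt = Mathf.Asin(Mathf.Clamp(axis.y, -1f, 1f)) * Mathf.Rad2Deg;

        // Symmetry: compare each point's offset from the start with the
        // mirrored point's offset from the end; less error means more symmetric.
        float err = 0f;
        for (int i = 0; i < pts.Count / 2; i++)
            err += Vector3.Distance(pts[i] - pts[0],
                                    pts[pts.Count - 1] - pts[pts.Count - 1 - i]);
        f.Symmetry = 1f / (1f + err / Mathf.Max(f.Length, 1e-4f));

        return f;
    }
}
```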

One of the biggest architectural shifts was moving away from a more free-form pattern model toward a guided composition system with stronger musical guarantees. In the current version, the app starts from a stable musical foundation and lets the user build one layer at a time. Each stroke is expressive, but it is also interpreted through rules that keep the result in key, aligned to the harmony, and structurally listenable.
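
To make the "stays in key" guarantee concrete, it ultimately reduces to a quantization step like the following sketch, which snaps a raw pitch estimate to the nearest degree of a major scale. The class name, the major-scale-only table, and the default key are assumptions for illustration:

```csharp
using UnityEngine;

// Minimal sketch of an "always in key" rule: snap a raw pitch estimate to
// the nearest scale degree. Scale table and naming are illustrative only.
public static class KeyQuantizer
{
    // Semitone offsets of a major scale within one octave.
    static readonly int[] MajorScale = { 0, 2, 4, 5, 7, 9, 11 };

    public static int SnapToKey(int midiNote, int keyRoot = 60) // 60 = middle C
    {
        int rel = ((midiNote - keyRoot) % 12 + 12) % 12;
        int best = 0, bestDist = int.MaxValue;
        foreach (int deg in MajorScale)
        {
            // Distance measured around the octave circle.
            int dist = Mathf.Min(Mathf.Abs(rel - deg), 12 - Mathf.Abs(rel - deg));
            if (dist < bestDist) { bestDist = dist; best = deg; }
        }
        return midiNote - rel + best;
    }
}
```

In the real app the scale would depend on the chosen genre and harmony; the point is only that correctness is enforced after expression, not instead of it.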

What we learned

The biggest lesson was that creative freedom works best when it is supported by invisible structure.

At first, it was tempting to think of musical expression in VR as pure openness: more controls, more gestures, more ways to manipulate sound. But in practice, especially for new users, too much freedom quickly becomes confusion. We learned that the most satisfying experience came from giving the user a strong musical framework and then letting gesture shape the variation inside that framework.

We also learned that hardware-specific design matters. The MX Ink is not valuable here just because it is new hardware; it is valuable because its form factor changes how the interaction feels. A stylus invites drawing, tracing, and sculpting in a way that standard VR controllers do not. That influenced both the interface design and the musical logic of the app.

On the technical side, we learned how important it is to align architecture with experience design. Once we shifted to the phased composition model, the codebase also had to change: domain types, guided defaults, re-derivation flows, phase state, and UI all needed to reflect the same mental model. The refactor was not just cleanup; it was a way of making the product more truthful to its own idea.
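
A stripped-down picture of what that alignment looked like in code terms, with hypothetical type names (and reusing the StrokeFeatures sketch from above): the phases become an explicit domain type, and every change funnels through one re-derivation path.

```csharp
using System.Collections.Generic;

// Hypothetical domain types mirroring the phased mental model.
public enum CompositionPhase { Harmony, Melody, Groove, Bass, Percussion }

public sealed class GuidedSession
{
    public CompositionPhase Current { get; private set; } = CompositionPhase.Harmony;

    readonly Dictionary<CompositionPhase, StrokeFeatures> _strokes =
        new Dictionary<CompositionPhase, StrokeFeatures>();

    // Every edit funnels through one entry point, so phase state, stored
    // strokes, and derived patterns can never drift apart.
    public void CommitStroke(StrokeFeatures features)
    {
        _strokes[Current] = features;
        RederiveFrom(Current);
        if (Current < CompositionPhase.Percussion) Current++;
    }

    void RederiveFrom(CompositionPhase phase)
    {
        // Re-run derivation rules for this phase and every phase that depends
        // on it, e.g. melody is re-aligned whenever harmony changes.
    }
}
```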

Challenges we faced

The hardest challenge was balancing expressiveness with musical correctness.

If the system followed the drawing too literally, the output could become chaotic or unpleasant. If it constrained the drawing too aggressively, the app stopped feeling creative. We had to iterate on that middle ground repeatedly: deciding which parts of a stroke should influence melody, which should affect groove, how harmony should remain stable, and how much variation percussion could introduce before losing the beginner-safe feel.
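
One way to summarize the middle ground we converged on: follow the stroke, but clamp its influence per layer. The sketch below is purely illustrative, with invented names and ranges rather than our tuned values:

```csharp
using UnityEngine;

// Illustrative only: each layer reads the stroke, but through its own clamp.
public static class InfluenceRules
{
    // Melody tracks the stroke's vertical contour, but never leaves one octave.
    public static float MelodyOffset(float rawDegrees) =>
        Mathf.Clamp(rawDegrees, -7f, 7f); // scale degrees around the tonic

    // Groove swing follows angularity: jagged strokes swing harder.
    public static float GrooveSwing(float angularityDeg) =>
        Mathf.Lerp(0.50f, 0.66f, Mathf.InverseLerp(0f, 90f, angularityDeg));

    // Percussion density grows with stroke length but is capped for beginners.
    public static float PercussionDensity(float lengthMetres) =>
        Mathf.Clamp01(lengthMetres / 1.5f) * 0.6f + 0.2f; // stays in 0.2..0.8
}
```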

Another challenge was scope. The original pitch imagined a very broad spatial music creation environment, but a hackathon project needs clarity. We had to decide what was essential to prove the idea. That led us to focus on the strongest version of the concept: guided composition through shape-driven musical phases, with the MX Ink stylus as the core interaction device.

We also faced the practical challenge of making multiple systems cooperate in real time inside VR: stylus input, stroke capture, UI interaction, guided state management, pattern replacement, genre switching, and musical re-derivation all had to feel stable and responsive enough to support a creative flow.
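
One pattern that helped is sketched below: per-frame work stays cheap (sampling stylus points), while expensive work (feature extraction, re-derivation) fires once per stroke. The stylus read is abstracted behind a delegate because the concrete MX Ink binding comes from the vendor SDK and is not shown here; everything else is a minimal sketch, not the production component.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Decouples cheap per-frame sampling from expensive musical re-derivation.
public class StrokeRecorder : MonoBehaviour
{
    // Supplied by the input layer; the concrete MX Ink binding is SDK-specific.
    public Func<(Vector3 pos, bool pressed)> ReadStylus;
    public event Action<List<Vector3>> StrokeFinished;

    readonly List<Vector3> _points = new List<Vector3>();
    const float MinSampleDistance = 0.005f; // 5 mm between samples

    void Update()
    {
        if (ReadStylus == null) return;
        var (pos, pressed) = ReadStylus();

        if (pressed)
        {
            // Cheap work only: append a point if the pen moved far enough.
            if (_points.Count == 0 ||
                Vector3.Distance(_points[_points.Count - 1], pos) > MinSampleDistance)
                _points.Add(pos);
        }
        else if (_points.Count > 0)
        {
            // Expensive work (feature extraction, re-derivation) happens once,
            // on stroke end, so the frame loop stays responsive.
            StrokeFinished?.Invoke(new List<Vector3>(_points));
            _points.Clear();
        }
    }
}
```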

Why this project matters to us

RhythmForge VR represents more than a music tool. It is our attempt to explore what native creative software for mixed reality can look like. Rather than porting desktop workflows into a headset, we wanted to ask what becomes possible when composition starts with motion, gesture, and space.

The result is a project that sits between instrument, interface, and composition assistant. It shows how the Logitech MX Ink can unlock a genuinely new form of creation in VR: not only writing or drawing in space, but composing with it.

Built With

Unity, Meta XR / OpenXR, Logitech MX Ink, Meta Quest