Inspiration
BeatBlockz was inspired by the idea of bringing real-world creativity into mixed reality. Traditional music tools often feel technical, intimidating, or locked behind years of experience. I wanted to design something that feels like play — where anyone can make music simply by reaching out and interacting with floating sound pads and patterns.
Having worked in the past on loop-based music systems, I’ve always loved the immediacy of playful music creation. Mixed reality finally makes it possible to turn an ordinary room into a musical playground, where rhythm, melody, and structure exist as tangible objects in space.
What it does
BeatBlockz transforms your room into a mixed-reality music studio. When you enter the experience, you are surrounded by a synthesizer and a pattern-based music sequencer floating naturally within your space.
At the center are eight circularly arranged pads or spheres. You hit them using virtual beat sticks to play notes on the currently selected instrument. The pads are mapped to the chosen musical scale and key, making it impossible to play wrong notes. An optional chord mode allows you to trigger full, harmonically correct chords with a single hit.
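The scale mapping described above can be sketched in a few lines. This is an illustrative, engine-agnostic sketch, not BeatBlockz's actual implementation: `PadToMidi` and `kMajorScale` are hypothetical names, and a major scale is assumed as the selected scale.

```cpp
#include <array>

// Semitone offsets of the major scale; BeatBlockz lets the user pick
// scale and key, so this table would be swapped per selection.
constexpr std::array<int, 7> kMajorScale = {0, 2, 4, 5, 7, 9, 11};

// Map one of the eight pads to a MIDI note in the chosen key.
// Pad 0..6 walk up the scale; pad 7 wraps into the next octave,
// so every pad always lands on an in-scale note.
int PadToMidi(int padIndex, int rootMidi) {
    int octave = padIndex / 7;
    int degree = padIndex % 7;
    return rootMidi + 12 * octave + kMajorScale[degree];
}
```

With C major (root 60), pad 0 plays C4, pad 2 plays E4, and pad 7 wraps to C5, which is why a wrong note is impossible by construction.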
Behind the pads is a visual pattern grid representing musical bars and recorded notes. During playback, these notes animate toward the player — similar to Beat Saber, but instead of destroying notes, you create them. Notes are recorded live by hitting the pads while playback is running.
In addition to the pads and pattern visualization, BeatBlockz provides a complete sequencer UI, including:
- playback controls
- velocity and quantization settings
- song, block, and pattern management
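Input quantization, one of the controls listed above, typically means snapping a live hit to the nearest grid subdivision. A minimal sketch, assuming time is measured in beats and `QuantizeBeat` is a hypothetical helper name rather than the plugin's API:

```cpp
#include <cmath>

// Snap a live hit, timed in beats, to the nearest grid step.
// grid = 0.25 corresponds to sixteenth notes in 4/4.
double QuantizeBeat(double beat, double grid) {
    return std::round(beat / grid) * grid;
}
```

A hit landing at beat 1.13 would snap to 1.25 on a sixteenth-note grid, so slightly early or late hits still sit cleanly in the pattern.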
BeatBlockz organizes music hierarchically:
- A song consists of multiple blocks
- Each block is a loop
- Each block contains multiple patterns, layered by instrument (for example drums, bass, or lead)
Patterns can use instruments from categories such as drums, bass, keys, strings, and more.
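The hierarchy above maps naturally onto nested containers. The following is a sketch of that structure; the field and type names are assumptions for illustration, not the project's actual classes:

```cpp
#include <string>
#include <vector>

// One recorded note inside a pattern.
struct Note {
    int midi;        // pitch as a MIDI note number
    double beat;     // position within the block, in beats
    double length;   // duration in beats
    int velocity;    // 0..127
};

// One instrument layer inside a block (e.g. drums, bass, or lead).
struct Pattern {
    std::string instrument;
    std::vector<Note> notes;
};

// A block is a loop made of layered patterns.
struct Block {
    int bars = 4;
    std::vector<Pattern> patterns;
};

// At this stage, a song is an ordered list of blocks.
struct Song {
    std::string name;
    std::vector<Block> blocks;
};
```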
The interaction adapts to the instrument type:
- One-shot instruments (for example drums) use flat pads you hit
- Sustained instruments (for example synthesizers) use spherical pads — the sound continues as long as your beat stick remains inside the sphere
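The sustained-pad interaction amounts to a simple state machine: a note-on fires when the stick tip enters the sphere, and a note-off fires when it leaves. A hedged sketch under that assumption, with `Vec3` and `SpherePad` as illustrative names:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

struct SpherePad {
    Vec3 center;
    double radius;
    bool sounding = false;

    // Called every frame with the stick tip position.
    // Returns +1 on note-on, -1 on note-off, 0 when nothing changes.
    int Update(const Vec3& tip) {
        double dx = tip.x - center.x;
        double dy = tip.y - center.y;
        double dz = tip.z - center.z;
        bool inside = std::sqrt(dx * dx + dy * dy + dz * dz) <= radius;
        if (inside && !sounding)  { sounding = true;  return +1; }
        if (!inside && sounding)  { sounding = false; return -1; }
        return 0;
    }
};
```

Because the state only changes on enter and exit, the sound sustains for exactly as long as the beat stick stays inside the sphere.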
Blocks, patterns, and songs can be saved and loaded.
At this stage, a song consists of a list of blocks. A full arranger timeline is planned for the future.
BeatBlockz makes music creation accessible and playful, turning it into a game-like mixed-reality experience rather than a technical task.
How we built it
BeatBlockz was developed in Unreal Engine 5.6 using the Meta XR plugin for passthrough mixed reality. The audio system is built on MetaSound combined with the Harmonix plugin, extended with custom code to support live input, recording, and real-time playback.
Challenges we ran into
Originally, BeatBlockz was designed to use hand tracking for music input. Finger-operated pads are common in real-world music controllers, and bringing that interaction into mixed reality felt natural.
However, after prototyping with the hand tracking SDK, it became clear that latency was too high for live musical performance. Even small delays made the interaction feel disconnected from the player’s intention. As a result, I switched to controller-based input, using drum-stick-like interactions with virtual pads.
Even with controllers, latency remained a challenge. To make pad hits feel responsive and musical, I implemented movement prediction so that interactions align better with the player’s intent.
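One common form of such prediction is linear extrapolation: project the stick tip forward by the measured latency using its current velocity, and test hits against the predicted position. This is a minimal sketch of that idea, not the project's actual code; `PredictTip` is a hypothetical helper:

```cpp
struct Vec3 { double x, y, z; };

// Extrapolate the stick tip forward by latencySeconds, assuming its
// velocity stays roughly constant over that short interval.
Vec3 PredictTip(const Vec3& pos, const Vec3& vel, double latencySeconds) {
    return { pos.x + vel.x * latencySeconds,
             pos.y + vel.y * latencySeconds,
             pos.z + vel.z * latencySeconds };
}
```

With, say, 50 ms of end-to-end latency and a fast downward swing, the predicted tip reaches the pad when the player perceives the hit, which is what makes the interaction feel aligned with intent.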
Another major challenge was working with MetaSound and the Harmonix plugin. Documentation is extremely limited, and most examples are designed around pre-existing MIDI files or editor-created patterns. To support live input and dynamically recorded patterns, I studied engine and plugin source code extensively and relied on trial-and-error experimentation.
Accomplishments that we're proud of
Despite joining the competition late, I built a complete Meta Quest application from scratch in under four weeks, including:
- a multi-instrument synthesizer
- a block- and pattern-based sequencer
- live performance and recording via virtual pads
- real-time pattern visualization
- song saving and loading
- a full user interface to control playback, recording, and sequencer elements
What we learned
One key lesson was that current hand-tracking latency is still too high for satisfying live musical performance. Even that failed approach was valuable learning.
More importantly, BeatBlockz demonstrated how natural and fun music creation can be in mixed reality, and how spatial interaction can lower the barrier to making music in entirely new ways.
What's next for BeatBlockz - Build Music in Mixed Reality
There is still a lot planned for BeatBlockz.
The most important next step is a song arranger. Currently, users create blocks and patterns, but only a single block can be played back. An arranger would allow blocks to be placed on a timeline and across tracks to form a complete song.
Next, instruments should become fully editable, with parameter changes such as volume or filter settings recordable directly into patterns.
Beyond that, BeatBlockz could evolve further with:
- sharing songs with other users
- exporting songs as MP3 files
- song-driven visual effects
- a gamified play mode, where finished songs can be experienced as an interactive rhythm game