Inspiration

Design tools are disconnected from physical space. We wanted to eliminate the gap between imagination and environment by turning real-world surfaces into an editable canvas.


What it does

Reality Composer lets users draw directly onto walls, floors, and objects in mixed reality. AI interprets sketches and converts them into structured, interactive 3D elements anchored to real space.
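One way to picture the "structured, interactive 3D elements anchored to real space" is as a small data model. The names below (`SpatialAnchor`, `SceneElement`) are hypothetical illustrations, not the project's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class SpatialAnchor:
    # Pose of the anchor in world space: position plus quaternion rotation.
    position: tuple[float, float, float]
    rotation: tuple[float, float, float, float] = (0.0, 0.0, 0.0, 1.0)

@dataclass
class SceneElement:
    # One AI-generated element pinned to a real-world surface.
    kind: str                      # e.g. "sticky_note", "shelf", "arrow"
    anchor: SpatialAnchor
    interactive: bool = False
    metadata: dict = field(default_factory=dict)

# A note drawn on a wall 1.5 m up becomes a persistent, tappable element.
wall_note = SceneElement(
    kind="sticky_note",
    anchor=SpatialAnchor(position=(1.2, 1.5, -0.4)),
    interactive=True,
)
```

Keeping the pose in a separate anchor object mirrors how XR runtimes persist world-locked content across sessions.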


How we built it

We designed the concept around OpenXR spatial tracking, Meta Scene Understanding, and an AI-powered semantic layer that translates strokes into 3D objects, with Unity as the rendering engine.
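The semantic layer's core job can be sketched with simple geometry heuristics. This is a hypothetical stand-in for the AI interpreter, not the actual model: it reads a raw stroke (a list of 3D points on a tracked surface) and guesses an object type from its shape.

```python
import math

def classify_stroke(points):
    """Heuristic stand-in for the semantic layer: map a raw stroke,
    given as a list of (x, y, z) points, to a candidate object type."""
    if len(points) < 2:
        return "point_marker"
    start, end = points[0], points[-1]
    # Total arc length of the stroke.
    path_len = sum(math.dist(points[i], points[i + 1])
                   for i in range(len(points) - 1))
    # Endpoints nearly touching relative to the path -> closed loop.
    if math.dist(start, end) < 0.1 * path_len:
        return "region"        # closed loop -> fillable surface region
    # Chord nearly as long as the path -> almost straight.
    if math.dist(start, end) > 0.95 * path_len:
        return "edge"          # near-straight stroke -> structural edge
    return "freeform_path"     # everything else stays a raw path

# A straight 1 m stroke along the x axis reads as an edge.
print(classify_stroke([(0, 0, 0), (0.5, 0, 0), (1.0, 0, 0)]))  # edge
```

A production version would replace these thresholds with a learned model, but the interface stays the same: points in, structured label out, which Unity then instantiates at the stroke's anchor.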


Challenges we ran into

The biggest challenge was balancing ambition with feasibility—ensuring the AI interpretation layer remains believable while keeping the interaction simple and intuitive.


Accomplishments that we're proud of

We created a clear, high-impact spatial interaction model that goes beyond whiteboarding and demonstrates a new way to author digital content in the physical world.


What we learned

Precision input dramatically changes how users think in mixed reality. When drawing becomes natural, spatial creativity increases significantly.


What's next for Reality Composer

We plan to prototype core spatial drawing and semantic object generation, validate with design professionals, and expand into enterprise collaboration modules.

Built With

  • mx
  • openxr