Inspiration

After several years developing for PCVR and native OpenXR, we wanted to see how Meta's SDK has evolved over the years and how far we could push what it allows us to do. We love base-builders and RTS games, but they always live on a flat map, and XR versions usually shrink down to tabletop experiences. With HoloForge we asked: what if your actual room was the map?

What it does

HoloForge scans your room and turns floors, walls, tables and corridors into a tactical battlefield. You build and upgrade a modular military outpost and place towers on real surfaces. Flying enemies navigate around your scanned room mesh and attack your base in waves, while you build, upgrade and defend in real time.

How we built it

We used Unity with URP and Meta’s XR/MRUK stack. The game runs on a custom fixed-step simulation layer for enemies, towers and buildings, kept completely separate from Unity's frame lifecycle and backed by spatial grids, pooling and Burst-friendly data structures for performance. Room-scan data is converted into usable primitives (floors, walls) for fast lookups and quick interactions, such as spawning portals or finding crash positions. For more demanding tasks, like validating building placement and steering enemies, we use the scanned room mesh directly. On top of that we added an event-driven architecture, UniTask for async flows, DOTween for UI/FX, and a palm-mounted UI that follows your hand, so all base controls stay accessible without breaking immersion, further enhanced by microgesture input.
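The fixed-step simulation layer described above can be sketched with the classic accumulator pattern: the renderer runs at whatever frame rate the headset delivers, while the simulation always advances in identical ticks. This is an illustrative Python sketch with assumed names and a 50 Hz tick; the project's actual implementation lives in Unity/C#.

```python
# Minimal sketch of a fixed-step simulation decoupled from the render
# frame rate (accumulator pattern). All names and the tick rate are
# illustrative assumptions, not the project's code.

FIXED_DT = 0.02  # assumed 50 Hz simulation tick


class Simulation:
    def __init__(self):
        self.accumulator = 0.0
        self.tick_count = 0

    def advance(self, frame_dt):
        """Called once per rendered frame with the variable frame delta.
        Runs zero or more fixed-size ticks to catch up, then returns the
        leftover fraction of a tick (usable for render interpolation)."""
        self.accumulator += frame_dt
        while self.accumulator >= FIXED_DT:
            self.step(FIXED_DT)
            self.accumulator -= FIXED_DT
        return self.accumulator / FIXED_DT

    def step(self, dt):
        # Enemies, towers and buildings would update here, deterministically.
        self.tick_count += 1


sim = Simulation()
# Three uneven render frames still produce uniform 20 ms simulation ticks.
for frame_dt in (0.016, 0.033, 0.011):
    alpha = sim.advance(frame_dt)
print(sim.tick_count)
```

The payoff is determinism: gameplay logic sees identical time steps regardless of frame-rate dips on the headset, which also keeps it independent of the engine's frame lifecycle.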

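The spatial grids mentioned above typically mean a uniform spatial hash: entities are bucketed by cell, so range queries (for example, a tower looking for targets) only inspect nearby cells instead of every object. A minimal Python sketch, with cell size and API as assumptions:

```python
# Sketch of a uniform spatial hash for fast neighbor queries.
# Cell size, names and API are illustrative assumptions.
from collections import defaultdict
import math

CELL = 1.0  # assumed cell edge length in meters


class SpatialGrid:
    def __init__(self):
        self.cells = defaultdict(list)

    def _key(self, x, y, z):
        return (math.floor(x / CELL), math.floor(y / CELL), math.floor(z / CELL))

    def insert(self, entity, pos):
        self.cells[self._key(*pos)].append((entity, pos))

    def query(self, pos, radius):
        """Return entities within `radius` of `pos`, checking only nearby cells."""
        r = int(math.ceil(radius / CELL))
        cx, cy, cz = self._key(*pos)
        hits = []
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                for dz in range(-r, r + 1):
                    for entity, p in self.cells.get((cx + dx, cy + dy, cz + dz), []):
                        if math.dist(pos, p) <= radius:
                            hits.append(entity)
        return hits


grid = SpatialGrid()
grid.insert("drone_a", (0.4, 1.2, 0.3))
grid.insert("drone_b", (5.0, 1.0, 5.0))
print(grid.query((0.0, 1.0, 0.0), 1.5))  # → ['drone_a']
```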
Challenges we ran into

  • Getting reliable room mesh data and making placement feel natural in unpredictable rooms.
  • Custom pathfinding and collision for flying enemies around arbitrary scanned geometry.
  • Designing UI that works both as a strategy control panel and as something you “wear” in MR.
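The write-up doesn't detail the custom pathfinding, but a common approach for flying agents around arbitrary scanned geometry is steering: seek the target, then blend in repulsion from nearby obstacle points. The sketch below is purely illustrative; every name and constant is an assumption, not the project's implementation.

```python
# Illustrative seek-plus-avoidance steering for a flying agent.
# All names, weights and radii are assumptions.
import math


def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)


def norm(v):
    m = math.sqrt(sum(x * x for x in v))
    return scale(v, 1.0 / m) if m > 1e-9 else (0.0, 0.0, 0.0)


def steer(pos, target, obstacles, avoid_radius=0.8, avoid_weight=2.0):
    """Blend a seek direction with repulsion from obstacle points in range."""
    desired = norm(sub(target, pos))
    for obs in obstacles:
        offset = sub(pos, obs)
        d = math.sqrt(sum(x * x for x in offset))
        if 1e-9 < d < avoid_radius:
            # Push away harder the closer the obstacle is.
            desired = add(desired,
                          scale(norm(offset), avoid_weight * (1.0 - d / avoid_radius)))
    return norm(desired)


# An enemy heading +x toward the base gets deflected sideways (-z) by a
# wall point near its path, while still making forward progress.
direction = steer(pos=(0.0, 1.5, 0.0), target=(4.0, 1.5, 0.0),
                  obstacles=[(0.5, 1.5, 0.3)])
print(direction)
```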

Accomplishments that we're proud of

We are proud of pretty much everything that can be seen in the project, because it didn't even exist four weeks ago. If we were to pick specific highlights, it would be the following:

  • A full UX loop, from room setup through a complete tutorial
  • Robust pooling and simulation that handles large battles well, considering the mobile hardware
  • Overall, a game with real potential, in a form we have not seen before: it is fun, engaging, encourages at least some movement around the room, and is ready for expansion and the rest of our feature ideas
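The pooling highlighted above is the standard object-pool pattern: rather than allocating and destroying enemies and projectiles every wave (which churns the garbage collector on mobile), instances are preallocated and recycled. A minimal sketch, with illustrative names:

```python
# Minimal object pool: instances are preallocated, recycled on release,
# and the pool grows on demand. Names are illustrative assumptions.
class Pool:
    def __init__(self, factory, size):
        self.factory = factory
        self.free = [factory() for _ in range(size)]  # preallocate up front
        self.active = []

    def acquire(self):
        obj = self.free.pop() if self.free else self.factory()  # grow if exhausted
        self.active.append(obj)
        return obj

    def release(self, obj):
        self.active.remove(obj)
        obj.reset()  # return the instance to a clean state for reuse
        self.free.append(obj)


class Projectile:
    def __init__(self):
        self.reset()

    def reset(self):
        self.alive = False


pool = Pool(Projectile, size=32)
shots = [pool.acquire() for _ in range(40)]  # exceeds preallocation; pool grows
for s in shots:
    pool.release(s)
print(len(pool.free), len(pool.active))  # → 40 0
```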

What we learned

I can't speak for the whole team here, but none of us really expected something put together so quickly to be this engaging, from the development side as well as the player side. We have already laid out a timeline beyond the competition, with additional features we haven't seen done elsewhere.

And I personally came to appreciate the strength of Meta's SDK, which was my learning goal on the tech side: I wanted to see how it has evolved and what state the Meta platform is in, as I hadn't worked with it directly in over four years. The biggest lesson, though, wasn't something we learned from the tech. For me, it was the journey with my friends. I had planned to develop and submit this game alone, but bringing my friends in on it made all the difference in the world. Without them, it would probably be moving boxes with no colors and sounds. Who knows if I would even be writing this story without them.

What's next for HoloForge

  • Performance optimization
  • More buildings and upgrades
  • Game loop polish
  • Local co-op
  • Possibly remote co-op by merging the scanned rooms
  • Depth occlusion
  • Custom hand gestures
  • 'Action' VR mode where you are transported from god view to first person and join the fight

Built With
