MoonBloom is a project developed at Ayzenberg's reality department. Our goal was to explore storytelling in the user's space, inspired by Mixed Reality's ability to understand both the room and the user. The result is a gesture-driven, spatially conscious story about a lonesome fox and a shattered moon.

What it does

MoonBloom is a narrative puzzle game that takes place across different areas of the user's space. It is powered by hand tracking, so users solve puzzles through direct physical interaction: lifting rocks, sliding logs, tracing constellations, and carrying pieces of the moon.

How I built it

MoonBloom was built by a small team of five, with me acting as Lead Developer and Interactive Designer. We built the project in Unity and incorporated the platform-specific Lumin and ARKit SDKs for spatial understanding. I approached development as a multi-stage process. First came envisioning, where we paper-prototyped the interactions and developed a sense of how the app should function. Next, I moved into an iterative prototyping process as we solidified the interactions and spatial design elements. We are currently in beta on our first level, and I primarily focus on optimization and final polish.
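
Supporting two spatial SDKs usually means putting a seam between puzzle logic and platform code. The sketch below is a hypothetical illustration of that idea, not MoonBloom's actual code: `SpatialBackend`, `FakeBackend`, and `largest_plane` are invented names, and a fake backend stands in for real Lumin meshing or ARKit plane detection.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Plane:
    """A detected surface in the user's room (world-space metres)."""
    center: Tuple[float, float, float]
    extents: Tuple[float, float]  # width, depth

class SpatialBackend(ABC):
    """Hypothetical seam over the platform SDKs; each platform
    (Lumin, ARKit) would supply its own subclass."""
    @abstractmethod
    def detected_planes(self) -> List[Plane]: ...

class FakeBackend(SpatialBackend):
    """Stand-in backend for exercising puzzle placement off-device."""
    def detected_planes(self) -> List[Plane]:
        return [Plane((0.0, 0.0, 1.5), (2.0, 2.0)),
                Plane((1.0, 0.0, 0.5), (0.5, 0.5))]

def largest_plane(backend: SpatialBackend) -> Plane:
    """Anchor a puzzle area on the biggest detected surface."""
    return max(backend.detected_planes(),
               key=lambda p: p.extents[0] * p.extents[1])
```

An abstraction like this lets the same placement logic run against either SDK, or against a test double when no headset is at hand.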

Challenges I ran into

The primary challenge in MoonBloom was onboarding users who had never experienced Mixed Reality. New users are hesitant to move around their space and get physically invested in puzzle solving. We solved this with an intro sequence in which users simply follow a fox around their room to get comfortable with moving. By tutorializing the mechanic of moving, users became more spatially conscious of their actions by the time puzzles were introduced.

Accomplishments that I'm proud of

MoonBloom is a digital adventure in your space, and the app pushes what that means from an interaction standpoint. Fostering a sense of presence is important in XR, so interactions in MoonBloom are powered by natural gestures. On the Magic Leap, I designed and engineered a system where users directly grab and move digital objects with their hands. On ARKit, I used the phone's 6DoF tracking to create a digital hand for users to move. These gestures gave users a direct sense of agency over their actions in the digital environment.
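
At its core, a grab gesture reduces to a geometric check: are the fingertips pinched, and is the pinch close enough to an object? The platform-agnostic sketch below illustrates that check under assumed thresholds; the constants and function names are illustrative, not the shipped values.

```python
import math

PINCH_THRESHOLD = 0.03  # metres between fingertips (assumed value)
GRAB_RADIUS = 0.10      # max reach from pinch to object (assumed value)

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(thumb_tip, index_tip):
    """Treat fingertips closer than the threshold as a pinch."""
    return distance(thumb_tip, index_tip) < PINCH_THRESHOLD

def grabbed_object(thumb_tip, index_tip, objects):
    """Return the nearest grabbable object within reach of a pinch,
    or None. `objects` maps names to world-space positions."""
    if not is_pinching(thumb_tip, index_tip):
        return None
    pinch = tuple((t + i) / 2 for t, i in zip(thumb_tip, index_tip))
    in_reach = [(distance(pinch, pos), name)
                for name, pos in objects.items()
                if distance(pinch, pos) <= GRAB_RADIUS]
    return min(in_reach)[1] if in_reach else None
```

On the Magic Leap the fingertip positions would come from hand tracking; on the phone, the "digital hand" driven by 6DoF device pose could feed the same check.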

What's next for MoonBloom

MoonBloom is a multi-chapter story that we intend to release to the public in installments. My team and I plan to continue developing the remainder of the story and to introduce more gesture-based interactions.

Built With

Unity, ARKit, Lumin SDK