Inspiration
Our references and inspirations combine technical resources and captivating gameplay mechanics. Pip Turner's Live XR Course explores various hand interactions and offers design tips, while Dilmerv's video tutorial provides comprehensive documentation for setting up the work environment, creating custom poses, and linking events. The forum dedicated to the XR Hands API complements these resources by covering in-depth management of both hands.
As for inspirations, we drew from gameplay mechanics such as Zelda: Wind Waker for exploration, OverBoard, MarineVerse Sailing Club, and Rigging Drill for realistic manipulation, as well as A Fisherman’s Tale 1 for its immersive interactions. These games helped us refine our approach to natural and engaging interactions.
What it does
This immersive experience uses the player's gestures to guide the movement and actions of the otters, the captains of their ship in the game. With intuitive motions, you can help the otters cross obstacles, rescue their friends, and solve interactive challenges. The gesture-based interaction keeps the experience smooth and engaging, creating total immersion in this enchanting world.
How we built it
We used Pip Turner's Live XR Course to learn how to manage hand interactions and Dilmerv's video tutorial to set up the environment, create custom poses, and link events. Initially, we integrated smooth interactions with both hands using the XR Hands API, but for optimization purposes, we transitioned to the Meta Interaction SDK.
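As a rough illustration of the kind of pose detection the XR Hands API makes possible, here is a minimal sketch that measures the thumb-to-index distance to recognize a pinch. The threshold value and the component wiring are assumptions for the example, not our exact code:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;
using UnityEngine.XR.Management;

// Sketch: detect a simple pinch pose by measuring the distance between
// the thumb tip and index tip reported by the XR Hands subsystem.
public class PinchDetector : MonoBehaviour
{
    // Threshold is an assumption; tune it per device and hand size.
    const float PinchThreshold = 0.02f; // metres

    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            m_Subsystem = XRGeneralSettings.Instance?.Manager?
                .activeLoader?.GetLoadedSubsystem<XRHandSubsystem>();
            if (m_Subsystem == null)
                return;
        }

        CheckPinch(m_Subsystem.rightHand);
    }

    void CheckPinch(XRHand hand)
    {
        if (!hand.isTracked)
            return;

        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);

        if (thumb.TryGetPose(out Pose thumbPose) &&
            index.TryGetPose(out Pose indexPose))
        {
            float distance = Vector3.Distance(thumbPose.position, indexPose.position);
            if (distance < PinchThreshold)
                Debug.Log("Pinch detected");
        }
    }
}
```

The Meta Interaction SDK replaces this manual joint math with built-in pose recognition, which is part of why the switch helped performance.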
Challenges we ran into
The main challenges were ensuring precise and fluid gestures, integrating custom poses and events without latency, and managing the movements of both hands in real time while maintaining performance. These adjustments required constant testing to deliver a natural and immersive experience. Additionally, recognizing the environment and the furniture in the room to randomly generate placements required numerous iterations and extensive testing.
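The random-placement step can be sketched roughly as follows. `SurfaceBounds` is a hypothetical stand-in for whatever surface data the scene-understanding API reports; the real pipeline depends on the SDK in use:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: scatter props on detected room surfaces. "SurfaceBounds" is a
// hypothetical stand-in for the planes reported by scene understanding.
public struct SurfaceBounds
{
    public Vector3 center;   // world-space centre of the surface
    public Vector2 size;     // width/depth of the (horizontal) surface
}

public static class PropPlacer
{
    // Pick a random point on a randomly chosen detected surface.
    public static Vector3 RandomPoint(IReadOnlyList<SurfaceBounds> surfaces)
    {
        var s = surfaces[Random.Range(0, surfaces.Count)];
        float x = Random.Range(-s.size.x * 0.5f, s.size.x * 0.5f);
        float z = Random.Range(-s.size.y * 0.5f, s.size.y * 0.5f);
        return s.center + new Vector3(x, 0f, z);
    }
}
```

Iterating on this kind of logic against real rooms, where detected furniture varies wildly, is what demanded the extensive testing mentioned above.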
Accomplishments that we're proud of
We are proud to have created an immersive experience where the player's gestures guide the interaction. We successfully made this interaction smooth, designed a captivating visual world, and delivered innovative gameplay.
What we learned
In Unity, we learned to use HandPose to orient the hand's position based on the environment. By transitioning to the Meta Interaction SDK, we gained expertise in gesture recognition and in managing the associated feedback. Finally, we learned to apply events linked to these gestures, such as actions or interactions within the game, using input systems or triggers to activate behaviors in response to these movements. This approach enables precise, interactive gesture controls in a Unity application, making the user experience more immersive and intuitive.
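The gesture-to-event wiring described above can be sketched like this; the `ReportPose` hook is a hypothetical name for however the recognizer reports its result, while `UnityEvent` is Unity's standard Inspector-wirable event type:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch: bridge a recognized hand pose to designer-configurable game
// behaviour via a UnityEvent, so reactions can be wired in the Inspector.
public class GestureEventBridge : MonoBehaviour
{
    // Invoked once each time the pose becomes active.
    public UnityEvent onGesture;

    bool m_WasActive;

    // Called every frame by whatever component performs the recognition;
    // "poseActive" would come from the SDK's pose-detection result.
    public void ReportPose(bool poseActive)
    {
        if (poseActive && !m_WasActive)
            onGesture.Invoke();
        m_WasActive = poseActive;
    }
}
```

Keeping the recognition and the reaction in separate components let us retarget the same gesture to different in-game actions without code changes.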
What's next for The Otter Side
The next step is to enhance the gestures by integrating mechanics inspired by maritime maneuvering signals used for navigation at sea. We will develop specific gestures corresponding to maneuvering commands, such as directional and orientation signals, and pair them with light cues to guide the user. This approach will make interactions even more precise and immersive, replicating an intuitive and natural communication system inspired by maritime practices. Finally, we will diversify the objectives, obstacles, and other game elements to deliver a comprehensive and engaging gameplay experience.
Built With
- c#
- handtracking
- metasdk
- openxr
- unity
- visual-studio