About the project

In today's virtual reality, many interactions still rely on controllers and buttons that add friction, make simple tasks difficult, and create a barrier to entry for thousands of users. With Kinesix, Leandro and Gustavo propose a change: letting people use their hands as they do in real life, with a credible, stable, and physically coherent grip. Our project explores controller-free interaction, in which the user touches, holds, and manipulates objects with natural physical behavior, delivering a more comfortable, intuitive, and accessible experience for everyone.

Inspiration

The project arose from two real needs:

Rehabilitation patients who cannot use controllers but can move their hands and need direct, simple, and human interaction.

Everyday users who want VR experiences without a learning curve, where grabbing an object is as natural as reaching out your hand.

We were inspired by the idea that VR should be brought closer to everyday life, not the other way around. And that immersion is built from simple things: a natural gesture, an object with weight, a stable grip.

What it does

Kinesix XR offers a bare-handed experience based on intuitive gestures and realistic physics:

Full controller-free mode: navigation and manipulation using only your hand.

Robust gesture detection: pinch, grab, hold, push, drag.

Realistic physics: objects with believable weight, friction, mass, and inertia.

Stable collisions and contacts: stack, rotate, and position without vibrations or artificial behaviors.

Minimalist visual cues: a subtle visual language that reinforces the perception of touch (hover, tap, grab).

Smooth interaction: natural transitions between states (tap → grab → release) even if tracking is not perfect.

In short: grabbing in VR feels like grabbing in real life.

How we created it

Engine: Unity 3D

SDK: Meta XR SDK (Quest Hand Tracking)

Hands and avatar: hand model with finger and forearm colliders for believable interactions.

Grasping system: elastic joints calibrated for stability, preventing vibrations and flickering in gestures.
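The joint tuning above can be sketched numerically. Below is a minimal 1-D spring-damper model in Python (purely illustrative; the real system uses Unity joints, and the numbers k = 400 N/m, m = 0.2 kg are invented for the demo). Setting the damping near the critical value 2√(km) is what removes the oscillation that reads as flickering in a grip:

```python
import math

def simulate_grip(k: float, c: float, m: float = 0.2, dt: float = 0.004, steps: int = 2000):
    """Pull a grabbed object toward the hand anchor (at x = 0) with a
    spring-damper 'elastic joint'. Returns the peak overshoot past the anchor."""
    x, v = 0.05, 0.0                    # start 5 cm away from the hand anchor
    overshoot = 0.0
    for _ in range(steps):
        a = (-k * x - c * v) / m        # spring force plus damping
        v += a * dt                     # semi-implicit Euler keeps it stable
        x += v * dt
        overshoot = max(overshoot, -x)  # distance past the anchor = oscillation
    return overshoot

k, m = 400.0, 0.2
under = simulate_grip(k, c=2.0)                   # underdamped: visible jitter
crit = simulate_grip(k, c=2 * math.sqrt(k * m))   # critically damped: settles cleanly
print(f"underdamped overshoot: {under * 100:.1f} cm")
print(f"critically damped:     {crit * 100:.4f} cm")
```

In Unity terms this corresponds to balancing a joint's spring and damper drive values against the object's mass; too little damping is exactly the vibration described above.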

Advanced physics: adjusted physical materials, consistent friction, realistic relative masses, and continuous collisions to prevent clipping.
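Continuous collisions address tunneling: a fast object can step from one side of a thin collider to the other between physics frames, so a check of the end position alone never sees the contact. A 1-D Python sketch (hypothetical numbers, not project code) shows why a swept test catches what a discrete test misses:

```python
def discrete_hits_wall(x0, v, wall, dt, steps):
    """Naive discrete check: only tests the position *after* each step."""
    x = x0
    for _ in range(steps):
        x += v * dt
        if abs(x - wall) < 0.01:        # within 1 cm of the thin wall plane
            return True
    return False

def swept_hits_wall(x0, v, wall, dt, steps):
    """Continuous (swept) check: tests the whole segment travelled in a step."""
    x = x0
    for _ in range(steps):
        x_next = x + v * dt
        if min(x, x_next) <= wall <= max(x, x_next):
            return True
        x = x_next
    return False

# A small object thrown at 10 m/s toward a wall at x = 0.95 m, 90 Hz physics step.
print(discrete_hits_wall(0.0, 10.0, 0.95, dt=1 / 90, steps=90))  # False: it tunnels
print(swept_hits_wall(0.0, 10.0, 0.95, dt=1 / 90, steps=90))     # True: crossing found
```

In Unity the same idea is enabled per rigidbody via continuous collision detection, at a modest performance cost that is worth paying for hand-thrown objects.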

Optimization for naturalness: microtransitions to compensate for tracking losses, reducing sudden drops and maintaining a feeling of solidity.
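The microtransition idea can be sketched as a hold-and-blend filter (a generic Python illustration, not the project's actual code): during a tracking dropout the last good pose is held, and on reacquisition the pose blends toward the new reading instead of snapping, so a grabbed object never jumps:

```python
def smooth_hand_position(samples, blend=0.25):
    """Filter a stream of tracked positions (None = tracking lost).
    On loss, hold the last good position; on reacquisition, blend toward
    the new reading instead of snapping."""
    out, held = [], None
    for s in samples:
        if s is None:                        # occlusion: hold the hand in place
            pass
        elif held is None:
            held = s                         # first valid sample
        else:
            held = held + blend * (s - held)  # exponential blend toward new pose
        out.append(held)
    return out

# 1-D positions in metres; the hand moves 29 cm while tracking is lost.
raw = [0.0, 0.01, None, None, 0.30]
filt = smooth_hand_position(raw)
print(filt)
```

The raw stream would snap the hand 29 cm in one frame; the filtered stream holds steady through the dropout and eases back in, trading a little latency for the feeling of solidity described above.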

Everything is designed with one guiding principle: every gesture should feel natural, intuitive, and physically consistent.

Challenges we ran into

Occlusions and tracking loss: when the hand is occluded or crosses over the other, the grip can be lost. We built a system that cushions these moments and keeps the held object stable.

Physical consistency between objects: adjusting masses and friction so that the virtual world “makes sense” and does not generate strange behaviors.

Grip stability: ensuring that objects do not shake or pass through the hand, even with imperfect tracking.

Complex interactions: allowing small objects to be stacked, rotated, and manipulated without breaking immersion.

These technical challenges forced us to iterate, test, and calibrate each physical parameter until we achieved a stable experience.

Accomplishments that we're proud of

Interaction that feels real: users intuitively perceive weight, inertia, and friction, making object manipulation feel natural from the very first attempt.

Stable grip even with imperfect tracking: smooth transitions and calibrated joints prevent jitter and sudden dropouts.

Increased accessibility: we eliminate the need for controllers, opening the door to non-technical users and patients who cannot hold controllers.

Polished, ready-to-use experience: navigation, manipulation, and visual feedback work smoothly and consistently.

What we learned

We learned that physics is as important as tracking. Even with good hand tracking, the interaction does not feel natural if the object behaves with incorrect friction, inconsistent weight, or imprecise collisions. The key is to find the balance between:

• naturalness

• physical response

• stability under imperfect tracking

When all of this converges, the user stops thinking about the technology and simply acts.

What's next for Kinesix XR

1- Deepen intelligent physics

• Improve virtual force detection.

• Enable two-handed grips for lifting heavier objects.

• Adjust friction and inertia dynamically based on the gesture.

2- Volumetric frequency map of the hand (3D heatmap + time)

We will develop a new analytical tool in VR: a volumetric 3D heatmap over time that records:

• finger and hand positions

• trajectories

• speeds

• derived accelerations

• most used areas of the interaction space

This will allow us to optimize gestures, colliders, and physical parameters using real usage data, taking the naturalness of the grip to the next level.
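As a sketch of how such a tool could accumulate this data, here is a minimal voxel-grid accumulator in Python (the class name, voxel size, and sample values are all invented for illustration): it bins hand positions into voxels, tracks dwell time per voxel, and derives speed between consecutive samples.

```python
from collections import defaultdict

class VolumetricHeatmap:
    """Voxelized map of the hand's interaction space: accumulates dwell
    time per voxel and derives speed between consecutive samples."""
    def __init__(self, voxel=0.05):          # 5 cm voxels
        self.voxel = voxel
        self.dwell = defaultdict(float)      # voxel index -> seconds
        self._prev = None                    # (t, (x, y, z)) of last sample

    def add_sample(self, t, pos):
        key = tuple(int(c // self.voxel) for c in pos)
        speed = 0.0
        if self._prev is not None:
            t0, p0 = self._prev
            dt = t - t0
            if dt > 0:
                dist = sum((a - b) ** 2 for a, b in zip(pos, p0)) ** 0.5
                speed = dist / dt
                self.dwell[key] += dt        # time spent reaching this voxel
        self._prev = (t, pos)
        return speed

    def hottest(self):
        """Most-used voxel of the interaction space."""
        return max(self.dwell, key=self.dwell.get)

hm = VolumetricHeatmap()
# The hand hovers near (0.10, 0.0, 0.10) for ~1 s, then makes one quick reach.
for i in range(10):
    hm.add_sample(i * 0.1, (0.10, 0.0, 0.10))
v = hm.add_sample(1.0, (0.30, 0.0, 0.10))    # 20 cm in 0.1 s -> ~2 m/s
print(hm.hottest(), f"{v:.1f} m/s")
```

Accelerations would follow by differencing the returned speeds, and the dwell map is exactly the "most used areas" signal that would drive collider and parameter tuning.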

Built With

Unity, Meta XR SDK (Quest Hand Tracking)
