Inspiration

Stopping to type in VR completely breaks the flow of an application. Even with experience, it's hard to get past visually searching for keys on a virtual keyboard. Virtual keyboards also lock the user in place while typing, slowing experiences down even more.

What it does

VRChord is a hand-tracked, position-agnostic chording keyboard. All standard keyboard inputs can be typed with one or two fingers, and the user is free to move around while typing.
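
To illustrate the concept (the actual chord layout isn't described here, so this mapping and the `ChordMap` type are hypothetical), a chording keyboard reduces typing to a small lookup from finger combinations to characters:

```csharp
using System.Collections.Generic;

// Hypothetical chord table: each chord is one or two finger inputs,
// encoded as a bitmask over tracked fingers. VRChord's real layout
// may differ; this only sketches the idea.
public static class ChordMap
{
    [System.Flags]
    public enum Finger
    {
        None        = 0,
        LeftIndex   = 1 << 0,
        LeftMiddle  = 1 << 1,
        RightIndex  = 1 << 2,
        RightMiddle = 1 << 3,
    }

    static readonly Dictionary<Finger, char> Chords = new Dictionary<Finger, char>
    {
        { Finger.LeftIndex,                      'a' }, // one-input chord
        { Finger.RightIndex,                     'e' },
        { Finger.LeftIndex | Finger.RightIndex,  't' }, // two-input chord
        { Finger.LeftMiddle | Finger.RightIndex, 's' },
    };

    public static bool TryResolve(Finger chord, out char c) =>
        Chords.TryGetValue(chord, out c);
}
```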

How we built it

VRChord was created in the Unity game engine for the Meta Quest 2. OpenXR is the only external library used, keeping the code tightly integrated with the framework.
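
The writeup doesn't include code, and since only OpenXR is cited, the team may query the runtime differently; as a minimal sketch assuming Unity's XR Hands package (com.unity.xr.hands, which exposes OpenXR hand tracking), per-frame joint data is read through a subsystem:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch: locate the running XRHandSubsystem so per-frame
// joint poses can be read. Assumes the com.unity.xr.hands package;
// the actual project may wire this up differently.
public class HandTrackingBootstrap : MonoBehaviour
{
    XRHandSubsystem hands;

    void Update()
    {
        if (hands == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                hands = subsystems[0];
            return;
        }

        if (hands.rightHand.isTracked)
        {
            var tip = hands.rightHand.GetJoint(XRHandJointID.IndexTip);
            if (tip.TryGetPose(out Pose pose))
                Debug.Log($"Right index tip at {pose.position}");
        }
    }
}
```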

Challenges we ran into

Buffering inputs is a major UX challenge. Since chords can be made up of one or two inputs, the program has to detect whether the user is ready to send a one-input chord or is waiting to add a second input.
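
One way to resolve this ambiguity (a sketch of the general technique, not necessarily VRChord's implementation; the `ChordBuffer` name and window length are illustrative) is a small state machine: the first input opens a short commit window, and the chord is sent either when a second input arrives or when the window expires:

```csharp
using System;

// Sketch of the one-vs-two input ambiguity: after the first input,
// wait up to CommitWindow for a second one before sending the chord.
public class ChordBuffer
{
    const double CommitWindow = 0.25; // seconds; illustrative, tune via testing

    int pending;            // bitmask of fingers in the current chord
    double firstInputTime;

    public event Action<int> ChordCommitted;

    // Call when a finger input is detected. `now` is a monotonic clock.
    public void AddInput(int fingerBit, double now)
    {
        if (pending == 0)
        {
            pending = fingerBit;
            firstInputTime = now;
        }
        else
        {
            // Second input arrived: commit the two-input chord immediately.
            Commit(pending | fingerBit);
        }
    }

    // Call every frame; commits a one-input chord once the window closes.
    public void Tick(double now)
    {
        if (pending != 0 && now - firstInputTime >= CommitWindow)
            Commit(pending);
    }

    void Commit(int chord)
    {
        pending = 0;
        ChordCommitted?.Invoke(chord);
    }
}
```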

Accomplishments that we're proud of

VRChord is able to interact seamlessly with scene elements! Input fields can be selected with pinch and point gestures, demonstrating its viability inside other applications or as a built-in option.
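
Pinch selection commonly reduces to a thumb-tip-to-index-tip distance check; here is a hedged sketch of that standard approach (again assuming the XR Hands joint API, with an illustrative threshold; VRChord's actual gesture logic isn't shown in this writeup):

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch of pinch detection by fingertip distance.
public static class PinchDetector
{
    const float PinchThreshold = 0.02f; // meters; illustrative value

    public static bool IsPinching(XRHand hand)
    {
        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);

        if (thumb.TryGetPose(out Pose thumbPose) &&
            index.TryGetPose(out Pose indexPose))
        {
            return Vector3.Distance(thumbPose.position,
                                    indexPose.position) < PinchThreshold;
        }
        return false;
    }
}
```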

What we learned

Hand tracking is hard. Getting gestures to work for every hand shape and to account for users' different resting positions was difficult, and it meant overriding OpenXR's default gesture recognition in favor of a custom recognizer.
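
One way such a recognizer can tolerate different resting poses (a sketch of the general idea, not VRChord's actual code; the `CurlGesture` name and threshold are hypothetical) is to measure finger curl as the angle between two bone segments, record each user's neutral curl once, and trigger on the change from that baseline rather than an absolute pose:

```csharp
using UnityEngine;

// Sketch: per-user calibrated finger-curl gesture. Triggering on the
// delta from a recorded resting pose, rather than a fixed pose, lets
// the same gesture work across hand shapes and resting positions.
public class CurlGesture
{
    float neutralCurl;               // captured during calibration
    const float TriggerDelta = 25f;  // degrees; illustrative value

    // Curl = angle between the proximal and distal segments of a finger,
    // given three joint positions along it (knuckle, middle, tip).
    public static float Curl(Vector3 knuckle, Vector3 middle, Vector3 tip)
    {
        return Vector3.Angle(middle - knuckle, tip - middle);
    }

    // Call once while the user's hand is at rest.
    public void Calibrate(Vector3 knuckle, Vector3 middle, Vector3 tip)
    {
        neutralCurl = Curl(knuckle, middle, tip);
    }

    // True when the finger has curled well past its resting position.
    public bool IsTriggered(Vector3 knuckle, Vector3 middle, Vector3 tip)
    {
        return Curl(knuckle, middle, tip) - neutralCurl > TriggerDelta;
    }
}
```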

What's next for VRChord

Predictive typing, additional language support, and expansion to even more gestures and configurations. VRChord is absolutely still a prototype, and the next step is to get it into a usable state for customers.

Built With

Unity, OpenXR
