V'RSignify

The "V'RSignify" project uses Meta Quest 3's hand-tracking technology to capture hand joint movements and translate American Sign Language (ASL) alphabetic letters into text. This project aims to improve communication and accessibility for deaf and hard-of-hearing individuals by displaying their signed letters as text within the VR environment. The hand-tracking data is accessed through Meta Quest 3's API, which makes it possible to track joint positions accurately. Rule-based gesture definitions were created to translate the user's input as seamlessly as possible.
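As a rough illustration of the rule-based approach described above, the sketch below shows one common way such rules can be built: derive a per-finger "curl" value from three tracked joint positions, then match the five curl values against hand-shape rules for individual letters. The function names, thresholds, and the two example letters are illustrative assumptions, not the project's actual code, which runs inside Unity against the Quest hand-tracking API.

```python
import math

def finger_curl(knuckle, mid, tip):
    """Curl in [0, 1] from three joint positions (x, y, z) along one finger:
    0 = fully straight, 1 = fully bent. Measures the angle at the middle joint.
    In the real app these positions would come from the Quest 3 hand-tracking API."""
    v1 = tuple(a - b for a, b in zip(knuckle, mid))
    v2 = tuple(a - b for a, b in zip(tip, mid))
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # A straight finger gives an angle of pi at the middle joint (curl 0);
    # a fully bent finger gives an angle near 0 (curl near 1).
    angle = math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
    return 1.0 - angle / math.pi

def classify_letter(curls):
    """Match per-finger curl values (thumb, index, middle, ring, pinky)
    against simple hand-shape rules. Thresholds here are illustrative."""
    thumb, index, middle, ring, pinky = curls
    if all(c > 0.8 for c in (index, middle, ring, pinky)) and thumb < 0.4:
        return "A"   # fist with the thumb resting alongside
    if all(c < 0.2 for c in (index, middle, ring, pinky)) and thumb > 0.6:
        return "B"   # flat open hand, thumb folded across the palm
    return None      # no rule matched this frame
```

Evaluating rules like these every tracking frame, with a short debounce before committing a letter to the displayed text, is a typical way to keep the output stable.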

Inspiration

Communication is a basic human right, yet millions of ASL users face daily communication barriers. Inspired by advancements in VR hand-tracking, we set out to create an inclusive tool to bridge this gap.

What it does

V'RSignify translates ASL gestures into real-time text on the Meta Quest 3, enabling seamless communication between ASL and non-ASL users.

How we built it

We used Unity, the Oculus SDK, and hand-tracking technology, combined with a custom AI model for gesture recognition and real-time text output.

What's next for V'RSignify

V'RSignify has the potential to grow into an accessibility app that eases everyday communication for people with disabilities. We also plan to develop similar apps for the new Meta Orion and Snap Spectacles.

Github repository: https://github.com/hojats7731/VRSignify
