What it does
It gives the user live subtitles of the other person in a conversation. It also lets the user sign in British Sign Language (BSL) and converts those signs into text and speech.
How we built it
This Unity application uses the Meta XR SDK to enable passthrough, hand tracking, voice recognition and other key features.
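As a rough illustration of the hand-tracking side, here is a minimal sketch (not the project's actual code) of reading a pinch gesture from the Meta XR SDK's OVRHand component; the component reference and logging are illustrative, and the real gesture set is far larger.

```csharp
using UnityEngine;

// Minimal sketch: detect an index-finger pinch via the Meta XR SDK's OVRHand API.
public class PinchGestureLogger : MonoBehaviour
{
    [SerializeField] private OVRHand hand; // assigned in the Inspector

    void Update()
    {
        // Only trust the data when the SDK reports the hand as tracked.
        if (hand == null || !hand.IsTracked)
            return;

        // GetFingerIsPinching / GetFingerPinchStrength are part of the public OVRHand API.
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch detected (strength {strength:F2})");
        }
    }
}
```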
Challenges we ran into
Precisely mapping every individual gesture was difficult and time-consuming. Additionally, the Quest 3's internal microphone was not sufficient to pick up another person from a distance, so we now use an external microphone.
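For the microphone switch, a sketch like the following shows how Unity's built-in Microphone API can prefer an external capture device; matching the device name on "USB" is an assumption about how the external mic appears in the device list, not the project's actual logic.

```csharp
using System.Linq;
using UnityEngine;

// Sketch: prefer an external (e.g. USB) microphone over the headset's built-in one.
public class ExternalMicCapture : MonoBehaviour
{
    private AudioClip captureClip;

    void Start()
    {
        // Unity lists all available capture devices by name.
        string device = Microphone.devices.FirstOrDefault(d => d.Contains("USB"))
                        ?? Microphone.devices.FirstOrDefault();

        if (device == null)
        {
            Debug.LogWarning("No microphone found");
            return;
        }

        // Record a looping 10-second buffer at 16 kHz, a common rate for speech recognition.
        captureClip = Microphone.Start(device, loop: true, lengthSec: 10, frequency: 16000);
        Debug.Log($"Capturing from: {device}");
    }
}
```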
Accomplishments that we're proud of
Learning more about the VR environment and how to develop for VR in Unity.
What we learned
Too much about the hand tracking and gesture system in the Meta XR SDK for Unity.
What's next for ENA Deaf Assistant
Recognition of other spoken languages for subtitling, and extending the sign-language recognition to cover more words and actions.