Inspiration

Do you have a friend or relative who speaks sign language, and would you like to learn it in a fun way? Or are you deaf and would like to communicate more easily with your VR friends?

When we develop VR applications or multiplayer VR worlds, we try to think of every user and to simplify the interfaces so that everyone can understand them, but we tend to design for the majority. Some minorities would like to use these technologies like everyone else, but a disability makes them harder to use.

That’s why I came up with the idea of integrating sign language into the XR universe using hand tracking.

What does it do?

For those who want to learn sign language, we can imagine an XR/MR scene with an avatar standing right in front of the user. This avatar would teach the user some basics of sign language, and the user would have to repeat the same hand movements (as in dance games, for example). There could be scenarios between the avatars and the user to simulate real conversations. The user could follow daily lessons to earn daily points.

How would it work?

To detect hand gestures, you need to enable hand tracking mode on the Pico 4 headset. This project idea can be developed in Unity with the PICO Unity Integration SDK, following its instructions on how to set up the VR hands and how to use them. With the HandPoseGenerator prefab, we can create various hand poses and attach actions that fire when the user makes a pose. The VR hands expose enough bones to create all the poses needed to speak sign language. However, sign language isn't just a hand pose: it is above all a combination of different hand poses, often with a movement (for example, "Thank you" is a circular movement of the hand from the chin towards the person in front of you). To check that a movement is performed correctly, we can proceed as in a dance game that detects the player's dance moves, allowing a position error interval.
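The actual project would do this in C# inside Unity with the PICO SDK, but the "position error interval" idea is language-agnostic. Here is a minimal Python sketch of it, under the assumption that a sign's movement is recorded as a list of 3D hand positions: both trajectories are resampled to the same length, then the performed one is accepted if every sample stays within a tolerance of the reference (all names and coordinates below are hypothetical).

```python
import math

def resample(points, n):
    """Linearly resample a list of (x, y, z) points to n samples,
    so trajectories of different durations can be compared."""
    if len(points) == 1:
        return points * n
    out = []
    for i in range(n):
        t = i * (len(points) - 1) / (n - 1)
        lo = int(t)
        hi = min(lo + 1, len(points) - 1)
        f = t - lo
        out.append(tuple(a + (b - a) * f for a, b in zip(points[lo], points[hi])))
    return out

def matches(reference, performed, tolerance=0.05, samples=32):
    """True if every resampled point of the performed trajectory lies
    within `tolerance` metres of the reference trajectory."""
    ref = resample(reference, samples)
    per = resample(performed, samples)
    return all(math.dist(a, b) <= tolerance for a, b in zip(ref, per))

# "Thank you": a rough arc from the chin outward (made-up coordinates).
thank_you = [(0.0, 1.5, 0.1), (0.05, 1.45, 0.2), (0.05, 1.4, 0.3)]
good_try  = [(0.01, 1.5, 0.1), (0.06, 1.44, 0.21), (0.04, 1.41, 0.3)]
bad_try   = [(0.0, 1.5, 0.1), (0.3, 1.2, 0.2), (0.5, 1.0, 0.3)]
print(matches(thank_you, good_try))  # True: within the error interval
print(matches(thank_you, bad_try))   # False: drifts too far away
```

A production version would also compare per-bone hand poses at each step, not just one hand position, and would tune the tolerance per sign.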

What do we have to be aware of?

It is important to note that sign language is not universal: there are around 300 different sign languages around the world. We might therefore consider using AI to analyse videos of people speaking their national sign language and automate the creation of 3D sign dictionaries.

What's next for Sign Language integration with hand tracking?

We could imagine implementing this feature in the Pico Worlds project. Imagine you are a deaf person and want to communicate with the other players in this multiplayer world, but can't because of the language barrier. Or, in reverse, you want to speak with your VR friend but can't understand him or her. This problem could be solved by a voice that speaks over the hand gestures ("Moves to Speech") or by real-time text subtitles.
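Once individual signs are recognised, turning them into subtitles (or into input for a text-to-speech engine) reduces to a dictionary lookup. A minimal sketch, assuming each recognised gesture arrives as a hypothetical string ID:

```python
# Hypothetical mapping from recognised sign IDs to words; a real build
# would feed the resulting text to a TTS voice or a subtitle overlay.
SIGN_TO_TEXT = {
    "sign_hello": "hello",
    "sign_thank_you": "thank you",
    "sign_goodbye": "goodbye",
}

def signs_to_subtitle(recognised_signs):
    """Turn a stream of recognised sign IDs into a subtitle line,
    skipping signs that are not yet in the dictionary."""
    words = [SIGN_TO_TEXT[s] for s in recognised_signs if s in SIGN_TO_TEXT]
    return " ".join(words)

print(signs_to_subtitle(["sign_hello", "sign_unknown", "sign_thank_you"]))
# hello thank you
```

Because sign languages have their own grammar, a real "Moves to Speech" feature would need more than word-for-word substitution, but a lookup table like this is enough for a first prototype.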
