Inspiration

One belief that unites our team is that Mixed Reality lets people learn skills much faster through immersive experiences. We set out to build an educational app that immerses users in real-world scenarios, and we wanted to pick a skill that is worth learning, and that can be learned much better and faster in mixed reality.

What it does

Our app teaches British Sign Language through an interactive tutorial, then immerses users in a simulated chess game where they control pieces using sign language. We chose chess because it's a globally recognized activity that uses simple letters (A-H) and numbers (1-8) to move pieces, making it an ideal framework for learning and practicing sign language.
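The letter/number scheme above is what makes chess a natural fit: a recognized sign pair maps directly onto a board coordinate. A minimal sketch of that mapping (function name and zero-based indexing are illustrative, not our exact code):

```python
# Hypothetical sketch: map a recognized letter/number sign pair to a
# zero-based (file, rank) board index. Files A-H become columns 0-7 and
# ranks 1-8 become rows 0-7.
def sign_to_square(letter, number):
    """E.g. ('E', 4) -> (4, 3)."""
    if letter not in "ABCDEFGH" or not 1 <= number <= 8:
        raise ValueError(f"not a board square: {letter}{number}")
    return (ord(letter) - ord("A"), number - 1)
```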

How we built it

We built our app using Unity as a game engine along with the Meta Quest SDK - including Presence Platform for hand tracking and Passthrough. We used Figma for the design and the Unity Asset Store for the 3D assets. We trained our AI sign language recognition model using Python + PyTorch and converted it into ONNX format to optimize it and use it inside our app.

Challenges we ran into

We ran into several challenges during this project; two are worth noting:

  • AI/ML: We wanted to train our own AI model for sign language recognition, since there is no SDK or open-source model for this. We tested Meta's SDK pose tracker, but it was not accurate enough. Therefore, we needed to find a way to record sign language hand-tracking data, save it, and train our own model. That was a challenge in itself, and integrating the model into Unity was not easy either, given the lack of documentation online.
  • Meta Quest Link: We wanted to use Meta Quest Link to develop and test our app directly in the Unity editor. We ran into many issues with it and spent a few hours trying to get it working. We considered giving up on it, but knew that solving it would let us iterate much faster. Extensive trial and error eventually got it working; we suspect the issue was related to the cable, though this hasn't been confirmed.
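The data-recording pipeline from the first challenge can be sketched simply: each captured frame of hand tracking becomes a labeled row of joint coordinates that the training script reads back later. The CSV format and function names below are illustrative assumptions (in our pipeline the coordinates came from the Meta SDK hand tracker):

```python
import csv

def save_samples(samples, path):
    """Write labeled hand-tracking frames to CSV.

    samples: list of (label, [x0, y0, z0, x1, y1, z1, ...]) pairs,
    one row per captured frame.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for label, coords in samples:
            writer.writerow([label, *coords])

def load_samples(path):
    """Read the recorded frames back for model training."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            samples.append((row[0], [float(v) for v in row[1:]]))
    return samples
```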

Accomplishments that we're proud of

We are proud of the way we worked together despite the challenges we faced. It was very smooth, and we had a great time. Each team member contributed equally, leveraging their strengths and learning from one another along the way. We are also proud of the execution itself: even though the idea was a little ambitious for just 48 hours, we managed to get a demo up and running by combining all of our skills.

What we learned

We've learned a ton through this process; two lessons are worth noting:

  • AI/ML: We learned how to integrate custom AI models into Unity for real-time use. We discovered that Unity's inference tooling only accepts models in the ONNX format, so we had to convert our PyTorch models before we could use them in the app.
  • Sign Language: Through this project, we learned the basics of British Sign Language. By building an app to teach it to others, we ended up learning the letters A-H and the numbers 1-8 ourselves.

What's next for HoloSign

We see significant growth potential for HoloSign, with several key opportunities:

  • Expand support to other sign languages, such as American Sign Language and French Sign Language, which are widely used and in high demand.
  • Introduce additional simulated environments beyond chess, like ordering coffee or entering a restaurant, to keep practice grounded in everyday situations and boost user engagement.
  • Develop a multiplayer setting where users can practice and play together, along with the option to have a virtual professor teaching sign language in real time, providing a more interactive and collaborative learning experience.

Built With

Unity, Meta Quest SDK (Presence Platform), Figma, Python, PyTorch, ONNX
