Inspiration

During my trip to San Francisco over the winter break, I saw many people who are hard of hearing struggling to communicate with others on the street. Coupled with the encouragement of a friend who is deeply enthusiastic about linguistics, I decided to try my hand at an app to assist in learning American Sign Language.

What it does

SignL aims to teach users the basics of American Sign Language. In the current build, users can practice the ASL alphabet with flashcards, multiple-choice questions, and a live video feed that prompts randomly generated letters to facilitate real-world practice.

How we built it

The project was built entirely in the Unity game engine. However, Python scripts and TensorFlow were used to tackle the largest challenge, discussed in the next section.

Challenges we ran into

A major goal of the project was to incorporate machine learning to recognize the user's hand motions as they practiced ASL. My unfamiliarity with Unity and its interoperability with OpenCV first led to an attempt at establishing a WebSocket connection between a TensorFlow/Python server and the Unity game. Eventually I trudged through trying to use OpenCvSharp inside Unity to do the recognition directly. Given the time constraints, this feature was ultimately unfinished.
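To make the bridge idea concrete, here is a minimal sketch of how predictions from the Python server could be framed for the Unity client. The message format (a 4-byte length prefix followed by UTF-8 JSON carrying a letter and a confidence score) is an illustrative assumption, not the actual protocol from the hackathon build:

```python
import json
import struct

def encode_prediction(letter: str, confidence: float) -> bytes:
    """Pack a model prediction as a length-prefixed JSON frame
    (hypothetical wire format for the Python -> Unity socket)."""
    payload = json.dumps({"letter": letter, "confidence": confidence}).encode("utf-8")
    # 4-byte big-endian length prefix lets the C# side know how much to read.
    return struct.pack(">I", len(payload)) + payload

def decode_prediction(frame: bytes) -> dict:
    """Unpack a frame produced by encode_prediction."""
    (length,) = struct.unpack(">I", frame[:4])
    return json.loads(frame[4:4 + length].decode("utf-8"))
```

On the Unity side, a C# client would read the 4-byte prefix first, then the JSON body, avoiding the partial-read issues that come with streaming sockets.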

Accomplishments that we're proud of

I am exceptionally proud of managing to "speedrun" the basics of Unity game development. C# was also a language I hadn't touched in quite a long time, and the long night spent on this project was well worth it.

What we learned

Through the creation of SignL, I learned more about the Unity engine and was able to compare it with Unreal Engine, which I had used at a different hackathon. I also picked up a little UI design and revived my C# knowledge. While I don't think my future career will use these skills directly, I am grateful to have diversified my STEM knowledge just a bit more.

What's next for SignL

As discussed earlier, the highest priority for SignL is a machine-learning model that detects the user's signs and advances them through the "Signing" practice section. Additionally, a broader range of topics and vocabulary would make the app more appealing. Lastly, a chatbot could be integrated as a translator between English and ASL to further encourage learning.
