Inspiration

The inspiration behind Hand Sign was to allow users to convert ASL (American Sign Language) into readable text, an implementation from which young children in particular could greatly benefit.

What it does

The project currently has a classifier for the key joints of the hand it recognizes. At the moment the detector can only detect one hand at a time. TensorFlow handles the deep learning for classifying the datasets and recognizing hand gestures.
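Dynamic gestures are recognized from a short history of fingertip positions rather than a single frame. A minimal sketch of that buffering idea, assuming a fixed-length history (the names and the length value here are illustrative, not the project's actual code):

```python
from collections import deque

# Fixed-length buffer of recent fingertip (x, y) positions; once filled,
# the flattened history is the kind of feature vector a model such as
# point_history_classifier.tflite would classify into a gesture.
HISTORY_LENGTH = 16  # illustrative value

point_history = deque(maxlen=HISTORY_LENGTH)

def push_point(x, y):
    """Record the fingertip position for the current frame."""
    point_history.append((x, y))

def flatten_history():
    """Flatten the (x, y) pairs into a single feature vector."""
    return [coord for point in point_history for coord in point]
```

Because the deque has a fixed `maxlen`, the oldest frames fall off automatically as new ones arrive, so the buffer always holds the most recent motion.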

How we built it

My approach in building this project started with deciding which datasets to use, and I went with the point_history_classifier.tflite dataset. After settling on that dataset, I first developed the key point handler and the point history handler, as these represent the models for detecting the skeleton of the hand gestures shown when running the demo. The main Python file then handles visualizing the box labels as well as the point history.
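A key point handler of this kind typically normalizes the raw landmark coordinates before they reach the classifier. This is a sketch of one common recipe, not the project's exact code: make every landmark relative to the wrist, flatten, and scale into [-1, 1].

```python
def preprocess_landmarks(landmarks):
    """Normalize a list of (x, y) hand landmarks for classification.

    Coordinates are made relative to the first landmark (the wrist),
    flattened, then scaled into [-1, 1] by the largest absolute value.
    The function name and layout are illustrative assumptions.
    """
    base_x, base_y = landmarks[0]
    relative = [(x - base_x, y - base_y) for x, y in landmarks]
    flat = [coord for point in relative for coord in point]
    max_value = max(map(abs, flat)) or 1.0  # avoid division by zero
    return [coord / max_value for coord in flat]
```

Normalizing this way makes the classifier indifferent to where the hand sits in the frame and how close it is to the camera.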

Challenges we ran into

The challenges I ran into were getting a smoother frame rate and getting accurate feedback from the machine-learning model.
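Frame-rate problems like this are easier to diagnose with a smoothed FPS readout than with noisy per-frame numbers. One simple approach, sketched here with illustrative names rather than the project's actual code, is a rolling average over the last few frame gaps:

```python
import time
from collections import deque

class RollingFps:
    """Estimate frames per second as a rolling average of recent frame gaps."""

    def __init__(self, buffer_len=10):
        self._last = time.perf_counter()
        self._gaps = deque(maxlen=buffer_len)

    def tick(self):
        """Call once per frame; returns the smoothed FPS so far."""
        now = time.perf_counter()
        self._gaps.append(now - self._last)
        self._last = now
        average_gap = sum(self._gaps) / len(self._gaps)
        return 1.0 / average_gap if average_gap > 0 else 0.0
```

Calling `tick()` once per loop iteration and overlaying the result on the video frame makes frame-rate regressions visible as soon as they happen.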

Accomplishments that we're proud of

The accomplishment I am happy about is figuring out how to make the hand recognition accurate with regard to the basic operations of what the hand is doing in the frame.

What we learned

What I learned was how to improve the way I generate my own datasets, as well as how I build machine-learning models.

What's next for Hand Signs Recognition

The next phase for Hand Signs Recognition would be to continue developing the ASL portion of the code and to do more testing on the interactions between the user and the application, particularly around the accuracy of capturing the sign-language input the user provides.

Built With

Python, TensorFlow
