Communication connects people by allowing them to convey messages, express their inner feelings, and exchange thoughts, either verbally or nonverbally. People with hearing impairments, however, often cannot communicate verbally. Sign language was developed to help hearing-impaired communities express themselves to others, but it is entirely different from spoken language: it has its own grammar and its own manner of expression. Unfortunately, learning and practicing sign language is not widespread, so the aim of this project is to develop a sign language recognition model based on American Sign Language (ASL) using the Leap Motion Controller dataset.
The Leap Motion Controller is a low-cost, palm-sized peripheral device specifically designed to track hand and finger motion with high precision in a 3D Cartesian coordinate system. It is widely used in areas such as gaming, device control, interactive art, and virtual reality.
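To make the tracking output concrete, here is a minimal sketch of one common preprocessing step: flattening the (x, y, z) joint positions reported for a tracked hand frame into a single feature vector that a classifier can consume. The function name and the fingertip coordinates are illustrative, not part of the original project.

```python
import numpy as np

# Hypothetical sketch: a tracked frame reports 3D positions for hand and
# finger joints; flattening them yields one feature vector per frame.
def frame_to_features(joint_positions):
    """joint_positions: list of (x, y, z) tuples from one tracked frame."""
    return np.asarray(joint_positions, dtype=float).ravel()

# Example: 5 fingertip positions -> a 15-dimensional feature vector.
fingertips = [(10.0, 20.0, 5.0), (12.0, 22.0, 6.0), (14.0, 21.0, 7.0),
              (16.0, 19.0, 6.5), (18.0, 17.0, 5.5)]
features = frame_to_features(fingertips)
print(features.shape)  # (15,)
```

Stacking such vectors across many frames and signers produces the kind of tabular feature matrix used for the experiments below.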
How I built it
Our team developed an ASL translator with 96% accuracy using a neural network on the Leap Motion ASL dataset, and also experimented with other machine learning algorithms, including Random Forest, KNN, SVM, Logistic Regression, and Naive Bayes.
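The comparison above can be sketched with scikit-learn by fitting each model family on the same train/test split. This is an illustrative skeleton, not the project's actual pipeline: the feature matrix here is synthetic stand-in data with the dataset's rough shape (428 features, 18 word classes), and all hyperparameters are defaults.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the Leap Motion ASL feature matrix:
# 540 samples, 428 features, 18 classes (one per word).
rng = np.random.default_rng(0)
X = rng.normal(size=(540, 428))
y = rng.integers(0, 18, size=540)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Neural Network": MLPClassifier(max_iter=300, random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": GaussianNB(),
}

results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    results[name] = model.score(X_test, y_test)
    print(f"{name}: {results[name]:.3f}")
```

On the real dataset each `score` call would report held-out accuracy for that model, which is how the rankings reported below were obtained.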
Accomplishments that I'm proud of
We achieved an accuracy of 96%!
What I learned
We learned about the implementations of various Machine Learning algorithms and how they can be applied to real-world use cases.
What's next for Sign Language Translator
This project presents an American Sign Language recognition system covering 18 words, built on the Leap Motion Controller ASL dataset. After performing PCA, 18 features were adopted from the original 428. We conclude that the neural network gives the highest accuracy at 95.8%, followed by Random Forest (95.0%), SVM (90.8%), KNN (79.7%), and Multinomial Logistic Regression (78.1%), with Naive Bayes lowest at 61.7%.
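The dimensionality-reduction step described above can be sketched with scikit-learn's PCA, projecting the 428 raw features down to 18 principal components. The data here is a synthetic placeholder with the dataset's shape; only the component count comes from the write-up.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic placeholder with the dataset's shape: 540 samples x 428 features.
rng = np.random.default_rng(1)
X = rng.normal(size=(540, 428))

# Project onto the top 18 principal components, as described above.
pca = PCA(n_components=18)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)  # (540, 18)
print(pca.explained_variance_ratio_.sum())
```

The reduced matrix `X_reduced` is what the classifiers would then be trained on; `explained_variance_ratio_` shows how much of the original variance the retained components capture.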
Future work could expand the system to word- and sentence-based recognition, as well as to other sign languages, rather than limiting recognition to ASL.