Inspiration
In my freshman year of high school, I visited a school for deaf and hard-of-hearing students to donate art supplies, and that visit inspired this project. Communicating with the students through a translator left a deep impression on me and brought home the communication barriers facing the deaf community. The experience ignited my passion for computer vision and machine learning, and as a result I developed an app that can translate sign language into English.
What it does
The Sign Language Translation App is a mobile app that converts American Sign Language into English in real time for easy communication. Using computer vision and machine learning, it translates sign language gestures into written language. We are broadening the app to include Indo-Pakistani and Chinese sign languages so it can serve a wider community.
How we built it
We built the app with Python and TensorFlow. It relies on a blend of computer vision and machine learning models, trained and tested thoroughly to ensure accurate translations. User feedback and rigorous testing guided iterative development. The interface is deliberately simple, designed so that anyone can use it and the technology stays readily accessible.
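To make the pipeline concrete, here is a minimal, hypothetical sketch of the recognition step, assuming the vision stage has already extracted 2-D hand-landmark coordinates from a camera frame. The function name `classify_gesture`, the template gestures, and the landmark format are all illustrative assumptions; the actual app uses trained TensorFlow models rather than the nearest-neighbour lookup that stands in for them here.

```python
import math

# Illustrative reference templates (NOT real data): gesture label
# mapped to a flattened list of (x, y) landmark coordinates.
TEMPLATES = {
    "HELLO": [0.0, 0.0, 0.5, 0.1, 1.0, 0.0],
    "THANKS": [0.0, 1.0, 0.5, 0.9, 1.0, 1.0],
}

def distance(a, b):
    """Euclidean distance between two flattened landmark vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_gesture(landmarks):
    """Return the label of the reference template closest to the
    observed landmarks. A trained network would replace this lookup."""
    return min(TEMPLATES, key=lambda label: distance(TEMPLATES[label], landmarks))

print(classify_gesture([0.1, 0.05, 0.5, 0.1, 0.9, 0.0]))  # prints "HELLO"
```

In the real app this classification would run per frame on the device, with the predicted labels assembled into English output.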
Challenges we ran into
One of the main challenges was translation accuracy across sign languages, each with its own nuances and subtleties. Variability in how different people execute the same sign compounded the problem. Collecting and annotating a large, diverse dataset for model training was also difficult, and achieving real-time performance on mobile devices without sacrificing precision was a further hurdle.
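One common way to tackle signer variability (a sketch under assumptions, not the app's exact method) is to normalise landmarks so that hand position and hand size no longer matter before classification. The helper below is hypothetical and uses only plain Python for illustration:

```python
import math

def normalize_landmarks(points):
    """Translate landmarks so their centroid sits at the origin, then
    scale them so the farthest landmark is at distance 1. Two signers
    performing the same sign at different positions and hand sizes
    then produce comparable feature vectors.
    points: list of (x, y) tuples."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    centered = [(x - cx, y - cy) for x, y in points]
    scale = max(math.hypot(x, y) for x, y in centered) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

# A small hand and a hand twice as large, shifted across the frame,
# normalise to (numerically) the same shape.
small = normalize_landmarks([(0, 0), (2, 0), (0, 2)])
large = normalize_landmarks([(10, 10), (14, 10), (10, 14)])
```

Normalising like this removes translation and scale differences, though it does not address differences in signing speed or style, which is why diverse training data remained essential.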
What we're proud of
We are proud to have built this app despite the challenges; it demonstrates both the potential impact and the relevance of the idea. Our proudest achievement is knowing that this tool could significantly enhance communication for people who sign.
What we learned
Learning Python and TensorFlow, and grappling with the art of sign language interpretation, has been rewarding. We also learned a great deal about the deaf community and about how technology can promote inclusiveness. Community feedback and interest taught us the importance of user-centric design and of continuous improvement and adaptation.
What's next for the Sign Language Translation App
The prospects for the Sign Language Translation App look good. We are currently improving the models by adding Indo-Pakistani and Chinese sign languages to make the app more useful and accessible, and we are refining its accuracy and ease of use based on input from early adopters. We want to support additional sign languages and, eventually, bi-directional communication, making the app a truly universal sign language translator. It could also be used to teach sign language. Our mission is just beginning, and we are here to make this world more equal, one step at a time.
Built With
- android-studio
- google-vision
- java
- keras
- tensorflow