Inspiration

Language is essential in today's world: we need it to communicate with people everywhere and get our ideas across. However, some individuals are unable to speak, so other forms of communication are required. Sign language exists, but not many people know how to use it. With the theme of education and Artificial Intelligence in mind, we were inspired to build an application that helps anyone learn sign language on their own, from the comfort of home. The application is interactive and suited to all age groups.

What it does

This application uses Artificial Intelligence and image recognition to recognize the letters of the American Sign Language alphabet. The program asks the user to display a letter to the webcam and then tells them whether they signed it correctly. As users continue to practice the letters, they become more and more fluent in sign language.

How we built it

We approached the project with a step-by-step plan. We began by writing a routine that displays the user's webcam feed on screen. We then processed the webcam frames so that only skin-colored pixels are kept while the rest of the image is blacked out; this lets the computer isolate the hand making the signs. We built this into an app and formatted it accordingly. Next, we trained an AI using 2-dimensional convolutional neural networks to detect and predict hand-sign images, using a dataset we found online. We refined the training process until we reached a high accuracy, then connected the trained model to the webcam inside the app. Finally, we coded the app to prompt the user with a letter and detect whether they displayed the correct hand sign, and we finished by polishing the app's aesthetics.
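The skin-isolation step can be sketched in pure NumPy. This is only an illustration, not our exact implementation: it uses a common RGB skin-color heuristic, whereas the real app thresholds live webcam frames (e.g., via OpenCV) and may use a different color space.

```python
import numpy as np

def skin_mask(rgb):
    """Boolean mask of likely skin pixels, using a simple RGB rule.

    Illustrative thresholds only; the project's actual thresholds
    and color space may differ.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (
        (r > 95) & (g > 40) & (b > 20)   # bright, warm pixel
        & (r > g) & (r > b)              # red-dominant
        & (np.abs(r - g) > 15)           # not grayish
    )

def black_out_background(rgb):
    """Keep skin-colored pixels and set everything else to black,
    so only the hand remains visible to the classifier."""
    out = np.zeros_like(rgb)
    mask = skin_mask(rgb)
    out[mask] = rgb[mask]
    return out
```

Applied to each frame before classification, this leaves the hand as the only bright region, which makes the CNN's job much easier.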

Challenges we ran into

Firstly, teaching the AI to recognize pictures was a challenge, as the concept was brand new to all of us. Figuring out how image recognition works and applying it to sign language took time and was quite difficult at first. It was also our first encounter with concepts such as convolutional neural networks, so understanding them well enough to use them was a big hurdle. On the UI side, we found it challenging to implement the webcam feature and have a button open the webcam the way we intended.

Accomplishments that we're proud of

One of our accomplishments was successfully training the AI to recognize the images we showed it, which in this case were sign language letters. We had to learn convolutional neural networks from scratch, so picking them up and putting them to work so quickly was a great achievement for us. Another accomplishment was connecting the UI to the AI, since we had to learn how to link the two so they would work together properly.

What we learned

This was our group's first time working with Artificial Intelligence and UI concepts. We learned how to set up a webcam and use image recognition to identify the hand gestures shown by the user. We also learned what Convolutional Neural Networks are, including 2D convolutions, max pooling, and other associated operations. Additionally, we learned how to train the computer to predict the letters the user shows to the webcam. On the UI side, we learned how to use Tkinter and connect all the AI code with the UI code. We also learned how to build a system that ties many libraries together.
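To make the two core CNN operations concrete, here is a tiny pure-NumPy sketch of a 2D convolution and max pooling. Our actual model was built with a deep learning framework; this is just the underlying math on a single channel.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D convolution (cross-correlation, as CNN layers compute it):
    slide the kernel over the image and take a weighted sum at each position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(image, size=2):
    """Non-overlapping max pooling: keep the largest value in each
    size x size window, shrinking the feature map."""
    h = image.shape[0] // size
    w = image.shape[1] // size
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = image[i * size:(i + 1) * size,
                              j * size:(j + 1) * size].max()
    return out
```

Stacking layers like these (with learned kernels and nonlinearities) is what lets the network turn raw hand images into letter predictions.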

What's next for Sign-IT

In terms of next steps, Sign-IT will soon introduce a sentences feature that lets you build your own sentences from the sign language you have learned! Sign-IT will also introduce a feature covering signs specific to sign language that express idioms. These updates will be prioritized because they are essential for understanding sign language culture. Further on, the application will include a chat box and a video-calling feature so users can interact with each other through sign language. There will also be games with points and sponsor-provided prizes, so users can play and build a friendly community.
