Inspiration

The inspiration came from our club's involvement with ASL. It is very hard to learn ASL without someone who already knows the language to correct your signs, so we decided to make it easier to learn ASL without another person.

What it does

This program prompts the user with a letter to sign, and the user then makes the sign to the best of their ability. If the prediction does not meet a reasonable confidence threshold, the user is not allowed to continue to the next letter. If the letter is wrong, the letter predictor tells the user which letter they are actually signing.
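The gating described above can be sketched as a small decision function. This is a minimal illustration, not the project's actual code; the threshold value and message wording are assumptions.

```python
import random
import string

# Assumed confidence cutoff; the real project may tune this differently.
CONFIDENCE_THRESHOLD = 0.8

def check_attempt(target_letter: str, predicted_letter: str, confidence: float) -> str:
    """Decide whether the user may advance to the next letter.

    A confident, correct sign advances; a confident but wrong sign is
    reported back so the user knows what they actually signed; a
    low-confidence prediction asks the user to try again.
    """
    if confidence < CONFIDENCE_THRESHOLD:
        return "Sign not recognized confidently. Try again."
    if predicted_letter == target_letter:
        return "Correct! Moving to the next letter."
    return f"That looks like '{predicted_letter}', not '{target_letter}'. Try again."

def next_target() -> str:
    """Pick a random uppercase letter for the user to practice."""
    return random.choice(string.ascii_uppercase)
```

In the tutor's main loop, `next_target()` would only be called again once `check_attempt` returns the success message.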

How we built it

We built the Sign Language Tutor using Python, OpenCV, and a convolutional neural network (CNN) trained to recognize American Sign Language (ASL) alphabet gestures. The system captures real-time video from a webcam, processes frames to predict hand signs, and compares them against a randomly selected target letter. Feedback is provided to help users practice and improve their signing accuracy. The project is modular and built for easy future expansion, such as adding more signs or integrating speech output.
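The frame-to-prediction path might look like the following sketch. The input resolution (64x64), the normalization, and the `model.predict` interface are assumptions for illustration, not the project's exact settings; the striding resize stands in for `cv2.resize` so the example stays self-contained.

```python
import numpy as np

INPUT_SIZE = 64  # assumed CNN input resolution
LETTERS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Turn a webcam frame (H, W, 3, uint8) into a CNN input batch.

    Center-crops to a square region of interest, downsamples by striding
    (a stand-in for cv2.resize), and scales pixels to [0, 1].
    """
    h, w, _ = frame.shape
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    roi = frame[top:top + side, left:left + side]
    step = side // INPUT_SIZE
    small = roi[::step, ::step][:INPUT_SIZE, :INPUT_SIZE]
    batch = small.astype(np.float32) / 255.0
    return batch[np.newaxis, ...]  # shape (1, 64, 64, 3)

def predict_letter(model, frame: np.ndarray):
    """Run the CNN on one frame and return (letter, confidence)."""
    probs = model.predict(preprocess(frame))[0]
    idx = int(np.argmax(probs))
    return LETTERS[idx], float(probs[idx])
```

In the real system, `frame` would come from `cv2.VideoCapture(0).read()` and `model` would be the trained CNN; `predict_letter` then feeds the gating logic that compares the prediction against the randomly selected target letter.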

Challenges we ran into

We ran into challenges with image formatting and API integration.

Accomplishments that we're proud of

We are proud that we have a working product to present, although some extra time would have been nice!

What we learned

We learned sign language and how to leverage AI to make our programs better.

What's next for Sign Language Tutor

Next, we want to take the Sign Language Tutor mobile, porting it to iOS and Android to make it more accessible.

Built With

Python, OpenCV
