Inspiration

I was browsing Kaggle and found an MNIST-style dataset for the ASL alphabet, so I decided to create a live ASL-to-text translator using my computer's webcam.

What it does

The program watches the webcam feed, isolates the hand gesture in each frame, and transcribes the recognized ASL alphabet letter to text in real time.

How I built it

Using Keras, I built and trained a Convolutional Neural Network on the dataset I found online. I then used OpenCV to isolate the hand gesture from each webcam frame and fed the isolated gesture into the CNN, which returned the predicted letter.

Challenges I ran into

I ran into constant challenges: the long training times of my model, working through the OpenCV documentation, and making sure every part of the code ran smoothly together.

What's next for ASL-Alphabet-Transcriber

I hope to expand the code so that it can translate more ASL in the future. This could help individuals who do not understand ASL converse with those who communicate primarily through ASL.
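As a rough sketch of the Keras side of the pipeline described above: the model below is a minimal CNN classifier, assuming the Kaggle Sign Language MNIST format (28x28 grayscale images over 24 static letters, since J and Z require motion). The exact architecture I used may differ; this is only an illustrative stand-in.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 24  # assumption: 24 static ASL letters (J and Z excluded)

def build_model() -> keras.Model:
    """Small CNN: two conv/pool stages followed by a dense classifier."""
    return keras.Sequential([
        keras.Input(shape=(28, 28, 1)),          # 28x28 grayscale input
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# One isolated hand-gesture frame (a dummy array standing in for the
# OpenCV-cropped webcam region, resized to 28x28 and scaled to [0, 1]):
frame = np.zeros((1, 28, 28, 1), dtype="float32")
probs = model.predict(frame, verbose=0)          # class probabilities
letter_index = int(np.argmax(probs))             # predicted letter index
```

In the live app, the `frame` above would come from OpenCV (grab a webcam frame, crop the hand region, convert to grayscale, and resize to 28x28) before being passed to `model.predict`.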

Built With

keras, opencv, python
