Inspiration
Ever since I started programming in my spare time, I have been interested in machine learning. What inspires me most is how many interesting problems machine learning can be applied to, so I decided to build a machine learning application during Hack the North.
What it does
The application I created uses a computer's webcam to analyze a person's hand gestures in real time. Based on the ASL manual alphabet letter that the user gestures with their hand, a neural network trained on the MNIST sign language dataset determines which letter it is.
How we built it
This application was built in Python. The neural network was programmed from scratch, using only the NumPy library for matrix manipulation, and trained on the MNIST sign language dataset. After training, I built a simple application that reads input from a webcam and uses the network to predict the meaning of a gesture in real time.
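To illustrate the idea, here is a minimal sketch of a from-scratch NumPy network of the kind described. The layer sizes and activations are assumptions (the actual architecture isn't specified): 28x28 grayscale inputs flattened to 784 features, one hidden layer, and 24 output classes, since the static-image sign language dataset omits the letters J and Z, which require motion.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in=784, n_hidden=64, n_out=24):
    """Randomly initialize weights and biases for a one-hidden-layer network."""
    return {
        "W1": rng.normal(0, 0.01, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.01, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(params, x):
    """Forward pass: ReLU hidden layer, then a softmax over letter classes."""
    h = np.maximum(0, x @ params["W1"] + params["b1"])
    logits = h @ params["W2"] + params["b2"]
    # Subtract the row max before exponentiating for numerical stability.
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

params = init_params()
x = rng.random((5, 784))            # a batch of 5 placeholder "images"
probs = forward(params, x)          # shape (5, 24), rows sum to 1
predictions = probs.argmax(axis=1)  # predicted class index per image
```

At inference time, each webcam frame would be cropped, converted to grayscale, resized to 28x28, and flattened before being passed to `forward`.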
Challenges we ran into
The main challenge I ran into was that the neural network was less accurate on webcam data than on the test split of the MNIST sign language dataset, because lighting and other conditions differed from the training data.
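One common mitigation for this kind of train/test mismatch (an illustration, not necessarily the fix used in the project) is to normalize each webcam frame so its brightness statistics roughly match the training distribution before classification:

```python
import numpy as np

def normalize_frame(frame):
    """Scale a grayscale frame to zero mean and unit variance.

    This removes overall brightness and contrast differences between the
    webcam and the training images, which helps when lighting conditions
    differ from the dataset the network was trained on.
    """
    frame = frame.astype(np.float64)
    std = frame.std()
    return (frame - frame.mean()) / (std if std > 0 else 1.0)

# Placeholder 28x28 "webcam crop" with pixel values in 0..255.
frame = np.random.default_rng(1).integers(0, 256, (28, 28))
norm = normalize_frame(frame)
```

The same normalization would need to be applied to the training images so that the network sees consistently scaled inputs in both settings.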
Accomplishments that we're proud of
This is one of the first times I have used a neural network in a programming project. I'm proud that I was able to build an application around a neural network I implemented from scratch.
What we learned
I learned a great deal about how a neural network works and how it can be used to solve problems in the real world.
What's next for Sign Language Interpreter
In the future, I would like to improve the network's accuracy by experimenting with additional hidden layers and convolutional layers.