Inspiration

After taking an ASL course, I realized that deaf and hard-of-hearing individuals need a lot of workarounds to communicate with people who don't understand sign language. Sure, they can always write the message down, but that only works when the sign language they use and the language the other person speaks are compatible (e.g. American Sign Language and written English). What if someone who only knew ASL wanted to communicate with someone who only knew French? The situation becomes considerably more difficult.

What it does

So our solution was to create a program that tracks the user's hand motion in space, uses machine learning to classify each motion as one of a set of signs, and then uses natural language processing to derive the meaning of the sign sequence so it can be translated into a number of other spoken languages.
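The three stages above (track, classify, translate) can be sketched as a toy pipeline. All names, the fake "model", and the sample data below are hypothetical stand-ins, not the project's actual code:

```python
# Illustrative three-stage pipeline: track -> classify -> translate.
# The toy "model" and lexicon are hypothetical stand-ins for the real
# trained classifier and NLP step.

def classify_motion(frames):
    """Map a sequence of hand-position frames to an ASL gloss (stand-in
    for the trained model: here, a trivial lookup on frame count)."""
    toy_model = {2: "HELLO", 3: "MY", 4: "NAME"}
    return toy_model.get(len(frames), "UNKNOWN")

def glosses_to_text(glosses, lexicon):
    """Render a sequence of ASL glosses as spoken-language text
    (stand-in for the natural language processing step)."""
    return " ".join(lexicon.get(g, g.lower()) for g in glosses)

# Two fake motions: one with 2 tracked frames, one with 3.
frames = [[(0, 0, 0), (1, 0, 0)], [(0, 0, 0), (1, 1, 0), (2, 1, 0)]]
lexicon_en = {"HELLO": "hello", "MY": "my", "NAME": "name"}

glosses = [classify_motion(f) for f in frames]
print(glosses_to_text(glosses, lexicon_en))  # hello my
```

Swapping `lexicon_en` for another language's lexicon is where the multi-language translation would plug in.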

How we built it

It is built using a Leap Motion controller to track the person's hand motion in space. A neural network trained on recordings of ASL signers classifies each hand motion as a specific sign; we then string the signs together and derive a single meaning using Python's Natural Language Toolkit (NLTK).
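The core of the classification step is matching an incoming motion's feature vector against the signs the model has learned. As a minimal sketch, a nearest-neighbour lookup over per-sign template vectors stands in for the neural network; the feature layout and template values here are invented for illustration:

```python
import math

# Hypothetical sign templates: fixed-length feature vectors that the
# real system would learn from Leap Motion recordings of ASL signers.
TEMPLATES = {
    "HELLO": [0.9, 0.1, 0.0],
    "THANK-YOU": [0.2, 0.8, 0.1],
    "YES": [0.1, 0.1, 0.9],
}

def classify(features):
    """Return the sign whose template is closest in Euclidean distance
    (a toy stand-in for the trained neural network)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda sign: dist(features, TEMPLATES[sign]))

print(classify([0.85, 0.15, 0.05]))  # HELLO
```

A real network replaces the distance computation with learned weights, but the interface (feature vector in, sign label out) is the same.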

Challenges we ran into

- Training the model: since there is large variation in the way things are signed, getting the classifier highly accurate is a big issue.
- Processing the meaning: many ASL signs are compounds of simpler signs (similar to how some Chinese words are made up of the characters of simpler words).
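The compound-sign problem resembles tokenization: a run of recognized component signs sometimes has to be merged into a single meaning before translation. A greedy longest-match over a compound table is one way to sketch it; the table entries below are hypothetical examples, not a real ASL lexicon:

```python
# Hypothetical compound table: sequences of component signs that should
# be merged into a single gloss before translation.
COMPOUNDS = {
    ("MOTHER", "FATHER"): "PARENTS",
    ("THINK", "SAME"): "AGREE",
}
MAX_LEN = max(len(k) for k in COMPOUNDS)

def merge_compounds(signs):
    """Greedily replace the longest matching run of component signs
    with its compound gloss; unmatched signs pass through unchanged."""
    out, i = [], 0
    while i < len(signs):
        for n in range(min(MAX_LEN, len(signs) - i), 1, -1):
            key = tuple(signs[i:i + n])
            if key in COMPOUNDS:
                out.append(COMPOUNDS[key])
                i += n
                break
        else:
            out.append(signs[i])
            i += 1
    return out

print(merge_compounds(["I", "THINK", "SAME", "YOU"]))
# ['I', 'AGREE', 'YOU']
```

Greedy longest-match is a simplification; ambiguous sequences would need context from the NLP stage to resolve correctly.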

Accomplishments that we're proud of

What we learned

What's next for Sign Language Translator

Probably moving from
