AI, machine learning, smart systems: these are the buzzwords of the tech scene right now, so it was only logical that we wanted to head in this direction. We came upon the Leap Motion and were amazed by its accuracy in motion tracking. The first thing that came to mind with this technology was sign language. After researching online, we found that the only translation resources were static images of ASL alphabet letters rather than the words they spell, and we knew we could do something to make translation more accurate, more intuitive, and most importantly, more fun!
What it does
Our American Sign Language Translator converts speech into an animated pair of hands that displays the corresponding ASL signs. Conversely, ASL Translator can recognize ASL signs and write the corresponding text to the screen.
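The recognition direction can be sketched as matching the per-frame hand features the Leap Motion exposes (e.g. which fingers are extended) against known letter patterns. The patterns and function below are simplified, illustrative stand-ins we wrote for this explanation, not the project's actual model; in the real app the input array would come from a LeapJS frame loop rather than being hard-coded.

```javascript
// Illustrative sketch: classify a static ASL letter from which fingers are
// extended. Fingers are ordered [thumb, index, middle, ring, pinky];
// true means the finger is extended in the current frame.
// These three patterns are simplified approximations, not a full ASL model.
const LETTER_PATTERNS = {
  B: [false, true, true, true, true],   // four fingers up, thumb tucked
  L: [true, true, false, false, false], // thumb and index form an "L"
  Y: [true, false, false, false, true], // thumb and pinky out
};

// Return the first letter whose pattern matches, or null if none does.
function classifySign(extendedFingers) {
  for (const [letter, pattern] of Object.entries(LETTER_PATTERNS)) {
    if (pattern.every((v, i) => v === extendedFingers[i])) return letter;
  }
  return null; // sign not in our tiny dictionary
}
```

In a LeapJS app, a per-frame callback (e.g. via `Leap.loop`) would read each finger's extended flag into such an array before classifying; a robust recognizer would also need hand orientation and motion over time, which this sketch ignores.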
How we built it
Challenges we ran into
Accomplishments that we're proud of
After continuously butting heads with LeapJS, we were unsure if we were even going to get any output at all. We were ecstatic to release not only the originally intended text-to-ASL translation, but also speech recognition and reverse (ASL-to-text) translation!
What we learned
1) node.js (kind of)
2) How to understand and reconstruct a codebase's structure through analysis and documentation
3) Working on a project without the help of Stack Overflow (there were few threads on the technology we used)
4) Pay someone else to set up your website for you; it's not worth your time and effort
What's next for American Sign Language Translator
If we were to continue this project, we would try adding additional hardware to significantly improve the accuracy of the solution; something like a Myo Band or a smartwatch with accessible accelerometers would let us recognize gestures that the Leap Motion cannot see. We would also implement a neural network to keep improving recognition accuracy over time, instead of relying solely on the initial training set.