I currently attend RIT, home of the National Technical Institute for the Deaf. During my first few days on campus, one of the hardest parts of my transition was asking for directions when there was a good chance the person I was asking couldn't understand me without my awkwardly typing into my notes app. I decided to use my limited knowledge of sign language to spare other new RIT students that experience and to help bridge the divide between the hearing and the hearing impaired, not only across campus but across the world.
What it does
The Leap Motion sensor sits over a front-facing camera on a laptop or phone. It reads the joint and fingertip positions of the user's hand and translates them into text, which can then be sent through an instant messenger or video chat, or simply used to join a conversation.
How I built it
While building this device, I tried a number of different peripherals for tracking the skeleton of the hand, including the Myo armband and an improvised Arduino setup. After settling on the Leap, I recorded calibration settings for the different letters of the sign language alphabet. I then worked out a priority ordering for letters, narrowed down the calibration settings, and worked on error minimization for the Leap sensor.
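A minimal sketch of how that calibration-plus-priority matching might look, assuming each letter's calibration is stored as a feature vector of hand measurements. The feature values, `CALIBRATION` table, and `PRIORITY` list here are hypothetical placeholders, not the project's real data:

```python
import math

# Hypothetical calibration table: letter -> feature vector recorded during setup.
# Features might be, for example, normalized fingertip extensions.
CALIBRATION = {
    "A": (0.2, 0.2, 0.2, 0.2, 0.9),
    "B": (0.9, 0.9, 0.9, 0.9, 0.3),
    "C": (0.6, 0.6, 0.6, 0.6, 0.6),
}

# Priority ordering: when two letters match almost equally well,
# prefer the one listed first.
PRIORITY = ["A", "B", "C"]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features, tolerance=0.15):
    """Return the calibrated letter nearest to the observed features.

    Candidates within `tolerance` of the best score are broken by PRIORITY,
    which is one way to handle signs the sensor sees as nearly identical.
    """
    scores = {letter: distance(features, ref) for letter, ref in CALIBRATION.items()}
    best = min(scores.values())
    candidates = [l for l, s in scores.items() if s - best <= tolerance]
    return min(candidates, key=PRIORITY.index)

print(classify((0.85, 0.9, 0.95, 0.9, 0.25)))  # prints "B"
```

Nearest-neighbor matching against recorded templates is a simple choice that needs no training data beyond one calibration pass per letter, which fits a hackathon timeline.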
Challenges I ran into
The Leap sensor is not a precision piece of equipment, and it has been the root of all of my problems with this hack. First, it cannot track crossed fingers, which means many signs that are completely different look nearly identical to the Leap. Second, since the Leap is a light-based sensor, it has to be recalibrated with every shift in the light level, be it the sun coming up, the sun going down, the lights being dimmed, or the lights being turned back on again. Lastly, I just don't know that much sign, and although I made good use of some friends in interpreting majors back home, that made the entire process more difficult overall.
Accomplishments that I'm proud of
I'm honestly proud I got anything done at all. This is my first solo hack, so it was extremely intimidating starting from scratch without a team behind me. Something else I'm particularly proud of is the way I calculated the positions of the fingers relative to the orientation of the palm and to each other, using key fingers to determine which letter is being signed.
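The palm-relative idea can be sketched as a change of basis, assuming the sensor reports the palm position, palm normal, and palm direction as 3D vectors (the Leap API exposes values like these); the helper names below are my own:

```python
# Express a fingertip in a palm-centered coordinate frame so the
# classification is invariant to where and how the hand is held.

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def palm_relative(tip, palm_pos, palm_normal, palm_dir):
    """Project a fingertip into the basis (direction, normal, side).

    `palm_dir` points from palm toward the fingers and `palm_normal`
    out of the palm; their cross product gives the sideways axis.
    """
    side = cross(palm_dir, palm_normal)
    offset = sub(tip, palm_pos)
    return (dot(offset, palm_dir), dot(offset, palm_normal), dot(offset, side))

# A fingertip 50 mm ahead of the palm along its direction axis
# comes out as (50, 0, 0) in the palm frame, whatever the hand pose:
print(palm_relative((0.0, 0.0, -50.0), (0.0, 0.0, 0.0),
                    (0.0, -1.0, 0.0), (0.0, 0.0, -1.0)))
```

Working in the palm's own frame means the same sign produces roughly the same coordinates whether the hand is near the sensor, far away, or tilted.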
What I learned
I learned a lot about the importance of error compensation from this hack, as well as about working with 3D coordinates in Python.
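One common form of error compensation with a noisy sensor is voting over a sliding window of recent frames, sketched below; the class name and window size are my own illustrative choices, not necessarily what the project used:

```python
from collections import Counter, deque

class FrameSmoother:
    """Smooths per-frame letter guesses by voting over a sliding window,
    so a single misread frame doesn't flip the emitted letter."""

    def __init__(self, window=5):
        self.recent = deque(maxlen=window)

    def update(self, guess):
        """Record the latest guess and return the current majority letter."""
        self.recent.append(guess)
        letter, _count = Counter(self.recent).most_common(1)[0]
        return letter

smoother = FrameSmoother(window=5)
for guess in ["A", "A", "B", "A", "A"]:  # one noisy frame in the middle
    letter = smoother.update(guess)
print(letter)  # prints "A": the stray "B" is outvoted
```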
What's next for SOS - Sign to Speech Converter
In the future, I would like to add support for more signs, an automatic calibration function, and the four letters that had to be left out due to limitations of the hardware.