Inspiration
Humans connect; it's in our nature. Yet there is one group many of us struggle to connect with: the deaf community. Learning sign language is hard, and sadly many people will not put in the effort it requires. The true indicator of a just society is the way it adapts to those most in need. With HelpingHand, we've taken a major step towards achieving just that.
Using the Leap Motion controller and SDK, together with the power of Java, we've built an application that works with the controller to recognise American Sign Language.
What it does
It uses the finger-position tracking of the Leap Motion Controller and processes that data to match a valid American Sign Language letter, essentially translating motion into text.
How we built it
We built it using the Leap Motion SDK combined with mathematical models represented through Java code.
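The matching step described above can be sketched as a nearest-neighbour lookup: each letter is stored as a template feature vector, and a live hand is classified by the closest template. This is a minimal illustrative sketch, not the actual project code; the class and method names (`LetterMatcher`, `addTemplate`, `classify`) are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the matching step: each ASL letter is stored as a
// template feature vector, and an observed hand is classified by the
// nearest template under squared Euclidean distance.
public class LetterMatcher {
    private final Map<Character, double[]> templates = new HashMap<>();

    public void addTemplate(char letter, double[] features) {
        templates.put(letter, features);
    }

    /** Returns the letter whose template is closest to the observed features. */
    public char classify(double[] features) {
        char best = '?';
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<Character, double[]> e : templates.entrySet()) {
            double d = 0;
            for (int i = 0; i < features.length; i++) {
                double diff = features[i] - e.getValue()[i];
                d += diff * diff;
            }
            if (d < bestDist) {
                bestDist = d;
                best = e.getKey();
            }
        }
        return best;
    }
}
```

In practice the feature vectors would come from the controller's per-frame finger positions after normalisation.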
Challenges we ran into
Originally we planned to recognise British Sign Language, but this proved difficult because the controller does not accurately track two hands, especially when they are touching.
Another challenge was normalising the input data. All hands differ in size, so we used a mean distribution over a set of manually learnt hands to give the software an initial grounding.
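The normalisation idea can be sketched as follows: fingertip positions are expressed relative to the palm and divided by the hand's own scale, so large and small hands yield comparable feature vectors, and templates are built by averaging samples. This is an illustrative sketch under stated assumptions; `HandNormalizer`, `Vec3`, and the choice of the palm-to-middle-fingertip distance as the scale factor are hypothetical, not the project's actual code.

```java
// Hypothetical sketch of hand-size normalisation: fingertip positions are
// taken relative to the palm and scaled by a per-hand distance, then
// several samples are averaged into a mean template per letter.
public class HandNormalizer {

    /** A minimal 3D point, standing in for the SDK's vector type. */
    public static class Vec3 {
        public final double x, y, z;
        public Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
        Vec3 minus(Vec3 o) { return new Vec3(x - o.x, y - o.y, z - o.z); }
        double length() { return Math.sqrt(x * x + y * y + z * z); }
    }

    /**
     * Returns fingertip positions relative to the palm, divided by the
     * palm-to-middle-fingertip distance (index 2), an assumed per-hand
     * scale factor.
     */
    public static double[] normalize(Vec3 palm, Vec3[] tips) {
        double scale = tips[2].minus(palm).length();
        double[] features = new double[tips.length * 3];
        for (int i = 0; i < tips.length; i++) {
            Vec3 rel = tips[i].minus(palm);
            features[3 * i]     = rel.x / scale;
            features[3 * i + 1] = rel.y / scale;
            features[3 * i + 2] = rel.z / scale;
        }
        return features;
    }

    /** Mean of several normalised samples: one template per letter. */
    public static double[] meanTemplate(double[][] samples) {
        double[] mean = new double[samples[0].length];
        for (double[] s : samples)
            for (int i = 0; i < s.length; i++) mean[i] += s[i] / samples.length;
        return mean;
    }
}
```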
We also found that the Leap Motion SDK is very fiddly and notoriously under-documented with respect to Java, which made it extremely difficult to get it working in the first instance.
Accomplishments that we're proud of
We are proud to say that the controller recognises letter characters of the American Sign Language.
What we learned
We learned a lot about how motion sensors work and the data processing behind them. Combining the two demands significant computing power; however, perhaps the most difficult part of this accomplishment was normalisation. The motion detector required us to convert every part of a human hand into a vectorised representation. Fortunately, our experience with Java made this straightforward to conceptualise, as we translated every part of the human hand into objects.
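The object model described above can be sketched as follows: the hand is decomposed into finger objects, each holding its tip position, and the whole hand flattens into a single feature vector for the recogniser. The names (`HandModel`, `Finger`, `toFeatureVector`) are illustrative assumptions, not the project's actual classes.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the object model: each finger carries its own
// tip position, and the hand flattens into one feature vector.
public class HandModel {
    public static class Finger {
        public final String name;
        public final double x, y, z; // tip position in millimetres
        public Finger(String name, double x, double y, double z) {
            this.name = name; this.x = x; this.y = y; this.z = z;
        }
    }

    private final List<Finger> fingers = new ArrayList<>();

    public void addFinger(Finger f) { fingers.add(f); }

    /** Flattens every fingertip into one vector for the recogniser. */
    public double[] toFeatureVector() {
        double[] v = new double[fingers.size() * 3];
        for (int i = 0; i < fingers.size(); i++) {
            Finger f = fingers.get(i);
            v[3 * i] = f.x;
            v[3 * i + 1] = f.y;
            v[3 * i + 2] = f.z;
        }
        return v;
    }
}
```

Modelling each hand part as an object keeps the per-frame sensor data and the recogniser's flat feature vectors cleanly separated.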
What's next for HelpingHand
We aim to enable two-hand recognition to support British Sign Language, as well as add a GUI for ease of use. However, this is only the immediate next step. The real scope of HelpingHand goes beyond video calls: we envision a portable translation system requiring only the motion detector, possibly paired with AirPods through an application so that a user hears the translation while communicating.