This Space Intentionally Left Blank
On its own, this hack doesn't have a huge social impact, but broader trends in society and technology suggest that multidisciplinary fields will increasingly come together to solve the problems we face today, with computer science and programming serving as a vessel for many of them. Our simple app is a demo of an idea with greater potential for the deaf community: as rapidly evolving technology makes information and communication easier to access, it can also help those living with disabilities. Recognizing complete American Sign Language is beyond the scope of a hackathon (and the capabilities of a Leap Motion), but we're still proud to have achieved recognition of most of the alphabet in finger spelling form.
What it does
The user makes basic finger spelling gestures above the Leap Motion; the web application displays a visualization of the user's hand and prints out the corresponding letter.
How I built it
- The entire project is a single client-side HTML file
- A Leap Motion for gesture detection and control
Challenges I ran into
- ^ Needless to say, the above was a significant challenge.
- Getting the visualizer running on the webpage using Three.js and LeapJS.
- Thanks to the highly sophisticated, cutting-edge technology of a $70 gadget, there were absolutely no issues at all reading in clear and specific data. The Leap Motion naturally reads user input flawlessly.
- A major challenge our team encountered was reading data (axes, directions, confidence, etc.) from the Leap Motion while simultaneously running a visualizer of that input. We hit a bug where running the two in separate loops caused a crash. With help from mentors and some careful debugging, we eventually overcame this challenge (we were missing a single line of code, go figure) and were able to proceed.
- A more theoretical challenge, once the above issue was solved, was how to use the data to recognize finger spelling gestures. Our approach was to set a "confidence" threshold on the Leap Motion data, since the visualizer may not match the user's actual hand. The input first has to meet that confidence level before we analyze specific finger joints as a multi-variable field, where several more series of thresholds determine whether the input matches a specific finger spelling gesture.
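The single-loop fix can be sketched roughly like this. This is illustrative, not our exact code: `extractHandData` and `updateVisualizer` are hypothetical names, and the helper is written as a pure function so it can run without the hardware attached.

```javascript
// Sketch of the single-loop fix: one Leap.loop drives both the data
// reading and the visualizer, instead of two competing loops.
// extractHandData is a pure helper, so it works on any frame-shaped object.
function extractHandData(frame) {
  if (!frame.hands || frame.hands.length === 0) return null;
  var hand = frame.hands[0];
  return {
    confidence: hand.confidence,     // 0..1, how sure the device is
    palmPosition: hand.palmPosition, // [x, y, z] in millimeters
    extended: hand.fingers.map(function (f) { return f.extended; })
  };
}

// In the browser (with leap.js loaded) this would be hooked up as:
// Leap.loop(function (frame) {
//   var data = extractHandData(frame);
//   if (data) {
//     updateVisualizer(data);  // hypothetical Three.js update
//     recognizeLetter(data);   // gesture classification
//   }
// });
```

Everything happens inside the one frame callback, which is what kept the two halves from fighting over the device.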
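The threshold idea looks roughly like the sketch below. The cutoff value, the pattern table, and the letters chosen are all illustrative assumptions; real ASL finger spelling needs many more joint-level checks than finger extension alone.

```javascript
// Hypothetical sketch of the confidence-then-thresholds approach.
// Below this confidence, the frame is treated as noise and ignored.
var CONFIDENCE_THRESHOLD = 0.7;

// Finger-extension patterns (thumb..pinky) for a few letters that
// happen to be distinguishable by extension alone.
var PATTERNS = {
  "false,true,true,true,true":   "B", // four fingers up, thumb folded
  "true,true,false,false,false": "L", // thumb and index extended
  "true,false,false,false,true": "Y", // thumb and pinky extended
  "false,true,true,false,false": "U"  // index + middle; a spread check
                                      // would be needed to tell U from V
};

function recognizeLetter(hand) {
  if (hand.confidence < CONFIDENCE_THRESHOLD) return null; // reject noisy frames
  var key = hand.extended.join(",");
  return PATTERNS.hasOwnProperty(key) ? PATTERNS[key] : null;
}
```

The U/V comment is the point of the "several more series of thresholds" remark above: many letters share an extension pattern and only differ in joint angles or finger spread.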
Accomplishments that I'm proud of
What I learned
What's next for RE:sign
lol it's in the name