Inspiration

This Space Intentionally Left Blank

Actual Inspiration

Every hackathon I typically pack a bunch of random hardware and figure out something to do with it; to Hoya Hacks, for example, I brought a couple of Arduinos, wires, LED/LCD screens, a Myo armband, and a Leap Motion. In past hackathons I've managed to smash hardware together with minimal actual coding involved. I'm proud that this one was different: even though we worked with the Leap Motion (and borrowed a second one for testing), this was also my first hackathon with real exposure to the software side of things. Personally it was an introduction to JavaScript and web app development, and it was a lot of fun to experiment with.

As for the specific hack itself: on its own it doesn't have significant social impact, but the broader trends of society and technology suggest that multidisciplinary fields will come together to solve many of the problems we face today, and computer science is a vessel through which many of those fields can work. Our simple app is a demo of an idea with greater potential for the deaf community: as rapidly evolving technology makes information and communication easier, it can also help those living with disabilities. Recognizing complete American Sign Language is beyond the scope of a hackathon and the capabilities of a Leap Motion, but we're still proud to have achieved recognition of most of the alphabet in fingerspelling form.

What it does

The user makes basic fingerspelling gestures above the Leap Motion, and the web application displays a visualization of the user's hand while printing out the corresponding letter.

How I built it

  • The entire project is a single client-side HTML file
  • A Leap Motion for gesture detection and control
  • JavaScript with Three.js and LeapJS to render the visualization (a minimal sketch follows this list)
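
For a concrete picture, here's a minimal single-file sketch of that setup, assuming local copies of three.js and leapjs; the fingertip spheres, script filenames, and the 200 mm vertical offset are illustrative placeholders rather than our actual code:

```html
<!-- Minimal sketch: one Leap.loop callback both reads frame data and
     renders the Three.js scene. Script paths assume local library copies. -->
<script src="three.min.js"></script>
<script src="leap.min.js"></script>
<script>
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 1000);
camera.position.z = 500;

var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// One sphere per fingertip, repositioned every frame.
var tips = [];
for (var i = 0; i < 5; i++) {
  var tip = new THREE.Mesh(new THREE.SphereGeometry(8), new THREE.MeshNormalMaterial());
  scene.add(tip);
  tips.push(tip);
}

Leap.loop(function (frame) {
  if (frame.hands.length > 0) {
    frame.hands[0].fingers.forEach(function (finger, i) {
      var p = finger.tipPosition;                    // [x, y, z] in mm above the device
      tips[i].position.set(p[0], p[1] - 200, p[2]);  // shift down to centre in view
    });
  }
  renderer.render(scene, camera);
});
</script>
```

The key point for later: the single Leap.loop callback is where both the data readout and the rendering happen.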

Challenges I ran into

  • No prior experience in JavaScript. No intro courses, no Codecademy, just dove right into the deep end.
  • ^ Needless to say, the above was a significant challenge.
  • Getting the visualizer running on the webpage using Three.js and LeapJS.
  • Thanks to the highly sophisticated, cutting-edge technology of a $70 gadget, there were absolutely no issues at all reading in clear, specific data. The Leap Motion naturally reads user input flawlessly.
  • A major challenge our team encountered was reading data from the Leap Motion (axes, directions, confidence, etc.) while simultaneously running a visualizer of that same input. We hit a bug where running the two in separate loops caused a crash. With help from mentors and careful debugging we eventually overcame it (naturally, a single missing line of code, go figure) and were able to proceed.
  • A more theoretical challenge, once the above was solved, was how to use the data to recognize fingerspelling gestures. Our approach sets a "confidence" threshold on the Leap Motion input, since the visualizer may not match the user's actual hand; only once the input crosses that threshold do we analyze specific finger joints across multiple variables, where a further series of thresholds determines whether the input is a specific fingerspelling gesture (see the sketch after this list).
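
To make the thresholding concrete, here is a hedged sketch of that idea with leapjs: the hand's confidence score gates out noisy frames, and simple finger-extension checks stand in for the joint-level thresholds we actually tuned. The 0.7 cutoff, the hypothetical #letter element, and the two letters shown are illustrative assumptions, not the full recognizer:

```javascript
Leap.loop(function (frame) {
  if (frame.hands.length === 0) return;
  var hand = frame.hands[0];

  // Gate on tracking confidence so we only classify clean frames.
  // (0.7 is an arbitrary illustrative cutoff.)
  if (hand.confidence < 0.7) return;

  // fingers are ordered [thumb, index, middle, ring, pinky];
  // extended is true when a finger is straightened out.
  var ext = hand.fingers.map(function (f) { return f.extended; });

  var letter = null;
  if (!ext[0] && ext[1] && ext[2] && ext[3] && ext[4]) {
    letter = 'B'; // flat hand, thumb tucked across the palm
  } else if (!ext[1] && !ext[2] && !ext[3] && !ext[4]) {
    letter = 'A'; // closed fist, thumb at the side
  }

  if (letter) {
    document.getElementById('letter').textContent = letter;
  }
});
```

Extending this toward more of the alphabet means swapping the boolean checks for per-letter thresholds on joint positions and directions, which is where the multi-variable tuning work went.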

Accomplishments that I'm proud of

everything.

What I learned

this was not a beginner-friendly intro to JavaScript

What's next for RE:sign

lol it's in the name
