Recognizing the lack of an efficient, cost-effective means of communication for those who are unable to speak, we felt the Leap Motion could be the tool to fill this gap.
What it does
Leap Sign translates American Sign Language into text: the Leap Motion reads hand signs, an autocomplete layer suggests completions for the word being signed, and the resulting sentence is displayed in real time on a web interface.
How we built it
We first used the Leap Motion to record the hand motions used in American Sign Language. The recognized letters were fed into another program that continuously suggests completions for the word being signed. The completed words were then stored in Firebase, read by another terminal, and displayed on a web interface, giving the user an easy way to see what they have signed.
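The word-suggestion step can be sketched as a simple prefix lookup over a word list (a minimal sketch: the dictionary, function name, and ranking here are our illustration, not the exact code we ran):

```python
# Minimal prefix-based autocomplete. In the project, the prefix came from
# letters recognized by the Leap Motion; here it is just a string.

WORDS = ["hello", "help", "helmet", "weather", "sign", "signal"]

def suggest(prefix, words=WORDS, limit=3):
    """Return up to `limit` words that complete the given prefix."""
    prefix = prefix.lower()
    matches = [w for w in words if w.startswith(prefix)]
    # Shorter completions first, so the most common word tends to appear on top.
    matches.sort(key=len)
    return matches[:limit]

print(suggest("hel"))  # -> ['help', 'hello', 'helmet']
```

In practice the word list would be a full English dictionary, and ranking by frequency rather than length would give better suggestions.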
Challenges we ran into
Most of our challenges came in transferring the auto-corrected words back into the sentence being assembled by the web interface. Additionally, the raw formatting of the OpenWeatherMap response was less than ideal for our purposes, but we were able to reshape it into something easier for a user to understand.
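Reshaping the weather data looked roughly like this (a sketch: the sample payload mirrors OpenWeatherMap's current-weather JSON shape, but its values and the output format are illustrative):

```python
# Flatten an OpenWeatherMap "current weather" response into one readable line.
# `sample` mirrors the API's JSON shape; the values are made up.

sample = {
    "name": "Atlanta",
    "main": {"temp": 293.15, "humidity": 60},   # temp is in Kelvin by default
    "weather": [{"description": "clear sky"}],
}

def summarize(payload):
    celsius = payload["main"]["temp"] - 273.15   # Kelvin -> Celsius
    desc = payload["weather"][0]["description"]
    return "{}: {:.0f}°C, {}, {}% humidity".format(
        payload["name"], celsius, desc, payload["main"]["humidity"])

print(summarize(sample))  # -> Atlanta: 20°C, clear sky, 60% humidity
```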
Accomplishments that we're proud of
We are quite proud of our team member Sidharth's efforts to translate the entire American Sign Language alphabet into a pattern-based system that the Leap Motion sensor can understand. Furthermore, after much trial and error, we were able to use Firebase to produce a real-time display of the signed sentence as it was being recorded by the Leap Motion.
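The pattern-based recognition can be illustrated as nearest-neighbor matching of a hand-pose feature vector against per-letter templates (a sketch: the feature encoding and template values here are invented for illustration; the real patterns came from recorded Leap Motion frames):

```python
# Classify a hand pose by comparing a feature vector (how extended each of the
# five fingers is, thumb first, on a 0.0..1.0 scale) against letter templates.
# These template values are hypothetical, not measured.

TEMPLATES = {
    "A": [0.0, 0.0, 0.0, 0.0, 0.5],  # fist with the thumb partly out
    "B": [0.0, 1.0, 1.0, 1.0, 1.0],  # four fingers extended, thumb tucked
    "L": [1.0, 1.0, 0.0, 0.0, 0.0],  # thumb and index extended
}

def classify(features):
    """Return the template letter closest to `features` (squared distance)."""
    def dist(letter):
        return sum((a - b) ** 2 for a, b in zip(features, TEMPLATES[letter]))
    return min(TEMPLATES, key=dist)

print(classify([0.9, 0.9, 0.1, 0.0, 0.1]))  # -> L
```

A real classifier would also use finger directions and joint positions from the Leap Motion frame, plus a confidence threshold so ambiguous poses are ignored rather than misread.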
What we learned
We learned a great deal about the different ways data storage can be used and how different terminals can access such storage. For example, after deliberating between MongoDB and Firebase, we determined that Firebase would give us and our users a more efficient path from ASL to auto-corrected text.
What's next for Leap Sign
We are currently discussing when it would be best for all of us to meet up regularly so we can continue working on Leap Sign.