Inspiration
We saw how incredibly frustrating it was to mime messages across a room, imagined how much more frustrating everyday communication must be for deaf people, and how much harder it is for the other person to understand.
What it does
Sign2Speech uses a Leap Motion controller to track American Sign Language hand movements, parses the individual letters into text, assembles that text into words, and feeds the result through Google Translate's text-to-speech engine to produce voice.
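The letters-to-words step of the pipeline can be sketched roughly as follows. This is an illustrative reconstruction, not the project's actual code: the class name, the pause-based word boundary, and the method names are all assumptions.

```java
// Hypothetical sketch of the Sign2Speech text pipeline: recognized letters
// are buffered into words, and the finished sentence is what would be
// handed to the text-to-speech step.
import java.util.ArrayList;
import java.util.List;

public class LetterBuffer {
    private final StringBuilder current = new StringBuilder();
    private final List<String> words = new ArrayList<>();

    // Called whenever the recognizer emits a letter.
    public void addLetter(char letter) {
        current.append(letter);
    }

    // Called when a pause between signs marks a word boundary.
    public void endWord() {
        if (current.length() > 0) {
            words.add(current.toString());
            current.setLength(0);
        }
    }

    // The sentence that would be sent to the speech engine.
    public String sentence() {
        return String.join(" ", words);
    }
}
```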
How I built it
Leap Motion, Eclipse, Java, Google Translate's API, and a careful distance from caffeine.
Challenges I ran into
Precisely converting finger movements into coordinates; handling Leap Motion's frequent reversals of hand direction; and compensating for the many false readings we got, so that recognition stayed precise while signing and speaking stayed comfortable.
Accomplishments that I'm proud of
We managed to perfect recognition of over 15 letters in six hours, working from mathematical data alone about the intermediate and distal phalanx bones of the human hand.
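The idea of recognizing letters from phalanx geometry can be sketched as nearest-template matching on per-finger bone direction vectors. Everything below is illustrative: the cosine-similarity scoring, the template vectors, and the class name are assumptions, not the project's actual data or method.

```java
// Minimal sketch of letter matching from phalanx directions: each letter has
// a template of per-finger unit direction vectors, and a measured hand is
// assigned the letter whose template scores highest by cosine similarity.
import java.util.Map;

public class LetterMatcher {
    // Cosine similarity between two 3-D direction vectors.
    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // hand and each template hold one direction vector per tracked finger.
    static char classify(double[][] hand, Map<Character, double[][]> templates) {
        char best = '?';
        double bestScore = -Double.MAX_VALUE;
        for (Map.Entry<Character, double[][]> e : templates.entrySet()) {
            double score = 0;
            for (int f = 0; f < hand.length; f++) {
                score += cosine(hand[f], e.getValue()[f]);
            }
            if (score > bestScore) {
                bestScore = score;
                best = e.getKey();
            }
        }
        return best;
    }
}
```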
What I learned
Working with Leap Motion; building in an IDE, OS, and language I had never used before and relying on them until the last minute; and the importance of planning out goals from the start.
What's next for Sign2Speech
Improving reliability, adding better autocorrect, letting users register custom gestures for commonly used phrases such as "thank you" and "hello", and configuring the system for other language platforms.
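The planned user customization could be as simple as a registry mapping a saved gesture to a full phrase, so one sign expands to "thank you" instead of being fingerspelled. This is a sketch of one possible design; the class and method names are invented here, not part of the project.

```java
// Hypothetical phrase registry for user-defined gestures: a recognized
// gesture id expands to its registered phrase, falling back to the raw
// fingerspelled text when nothing is registered.
import java.util.HashMap;
import java.util.Map;

public class PhraseRegistry {
    private final Map<String, String> phrases = new HashMap<>();

    public void register(String gestureId, String phrase) {
        phrases.put(gestureId, phrase);
    }

    // Falls back to the fingerspelled text when no phrase is registered.
    public String expand(String gestureId, String fingerspelled) {
        return phrases.getOrDefault(gestureId, fingerspelled);
    }
}
```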