We decided to work on this project because we wanted to build something meaningful, something that could be genuinely useful to people with disabilities.

What it does

Transign uses the Leap Motion controller to read finger positions and output the corresponding letter. The recognized letters are then passed through Google Translate, and the result is converted to an MP3 file with a text-to-speech engine.
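The step between per-frame letter recognition and the translate/speak stage could look roughly like the sketch below. It buffers single-letter predictions into text, collapsing the repeated predictions that a frame-by-frame recognizer produces; the `LetterBuffer` class and its method names are our illustration, not code from the project, and the final Translate/TTS hand-off is shown only as a comment since it needs network access.

```python
class LetterBuffer:
    """Collects per-frame letter predictions and assembles them into text."""

    def __init__(self):
        self._letters = []
        self._last = None

    def push(self, letter):
        # Collapse repeated per-frame predictions of the same letter,
        # so holding a sign steady for many frames yields one character.
        if letter != self._last:
            self._letters.append(letter)
        self._last = letter

    def hand_lost(self):
        # Called when no hand is visible; this lets a doubled letter
        # ("LL") be signed by dropping the hand between the two L's.
        self._last = None

    def text(self):
        return "".join(self._letters)


buf = LetterBuffer()
for frame_letter in ["H", "H", "H", "E", "L"]:
    buf.push(frame_letter)
buf.hand_lost()
for frame_letter in ["L", "O", "O"]:
    buf.push(frame_letter)

print(buf.text())  # -> HELLO
# The assembled text would then go to Google Translate and a
# text-to-speech converter to produce the MP3, e.g. (hypothetical):
#   gTTS(text=buf.text(), lang="en").save("out.mp3")
```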

How we built it

We built it using the Leap Motion SDK, Java, and Python. The app listens to the Leap Motion's tracking data (the same data shown in the Leap visualizer) for finger positions, then runs our matching algorithm to output the corresponding letters in the interface.
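The matching step can be sketched as a nearest-template classifier: each frame's five finger-extension values are compared against stored per-letter templates, and the closest template wins. This is our illustrative reconstruction under assumptions; the template vectors, the `classify` function, and the 0.0-to-1.0 extension encoding are placeholders, not the project's actual calibration data or API.

```python
import math

# Illustrative templates: (thumb, index, middle, ring, pinky) extension,
# where 0.0 = fully curled and 1.0 = fully extended. Real values would
# come from Leap Motion tracking data for each fingerspelled letter.
TEMPLATES = {
    "A": (1.0, 0.0, 0.0, 0.0, 0.0),  # fist with thumb alongside
    "B": (0.0, 1.0, 1.0, 1.0, 1.0),  # four fingers extended, thumb tucked
    "L": (1.0, 1.0, 0.0, 0.0, 0.0),  # thumb and index extended
}

def classify(fingers):
    """Return the letter whose template is nearest (Euclidean) to the frame."""
    def dist(template):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(fingers, template)))
    return min(TEMPLATES, key=lambda letter: dist(TEMPLATES[letter]))


print(classify((0.9, 0.1, 0.0, 0.0, 0.1)))  # -> A
print(classify((0.1, 0.9, 0.9, 0.8, 0.9)))  # -> B
```

A nearest-neighbor match like this is easy to extend: adding a letter is just adding a template row, which fits the incremental way the alphabet was built out.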

Challenges we ran into

We ran into plenty of challenges, but the hardest was making the letter recognition reliable: finger positions were difficult to interpret from the Leap Motion data, and some letters get confused with others when fingers overlap in the visualizer.
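One common way to soften this kind of confusion (a sketch, not what the project actually implemented) is to reject ambiguous frames: only accept a letter when the best template match beats the runner-up by a clear margin, and skip the frame otherwise. The template values and the `margin` threshold below are made-up placeholders.

```python
import math

# Placeholder templates for letters that are easy to confuse when
# fingers overlap (M and N differ mainly in the ring finger).
CONFUSABLE_TEMPLATES = {
    "M": (0.0, 0.1, 0.1, 0.1, 0.0),
    "N": (0.0, 0.1, 0.1, 0.0, 0.0),
    "B": (0.0, 1.0, 1.0, 1.0, 1.0),
}

def classify_with_margin(fingers, margin=0.3):
    """Return the nearest letter, or None when the frame is too ambiguous."""
    scored = sorted(
        (math.dist(fingers, template), letter)
        for letter, template in CONFUSABLE_TEMPLATES.items()
    )
    best, runner_up = scored[0], scored[1]
    if runner_up[0] - best[0] < margin:
        return None  # too close to call; drop this frame instead of guessing
    return best[1]


print(classify_with_margin((0.0, 0.1, 0.1, 0.05, 0.0)))  # -> None (M vs N)
print(classify_with_margin((0.0, 1.0, 1.0, 1.0, 1.0)))   # -> B
```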

Accomplishments that we're proud of

We are proud that we got it working, although only the interpretation component is finished so far.

What we learned

We learned a lot about the Leap Motion: its uses, and how it works overall.

What's next for Transign

We want to finish the translation part of the app and test it with real users.
