Inspiration
Roughly 70 million people worldwide rely on sign language to communicate. Unfortunately, most people cannot communicate back with them. Using the Leap Motion camera, we can capture hand signs to help interpret and translate what a person is signing.
What it does
By pinpointing locations on your palm and fingers, the Leap Motion can interpret movement and gestures created by your hands. After being trained on our data, our AI determines the most likely words that the sign movement produced.
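The word-matching step can be pictured as a nearest-neighbour lookup over known sign feature vectors. This is a simplification of what the AI does, and every name and value below is invented for illustration:

```python
import math

# Hypothetical illustration: each sign is summarized as a small feature
# vector (e.g. per-finger bend angles in radians). Real Leap Motion frames
# carry far more data; these signs and values are made up for the sketch.
KNOWN_SIGNS = {
    "hello": [0.1, 0.1, 0.1, 0.1, 0.1],   # open palm: fingers barely bent
    "yes":   [1.4, 1.4, 1.4, 1.4, 1.2],   # fist: all fingers curled
}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rank_matches(observed):
    """Return known signs ordered from best to worst match."""
    return sorted(KNOWN_SIGNS, key=lambda sign: euclidean(observed, KNOWN_SIGNS[sign]))

# A mostly-open hand should match "hello" first.
print(rank_matches([0.2, 0.1, 0.15, 0.1, 0.2]))  # → ['hello', 'yes']
```

A trained model replaces the hand-picked vectors with learned ones, but the output shape is the same: a ranked list of candidate words.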
How we built it
We integrated the Leap Motion camera and, with the help of its SDK, accessed the frames captured as we move our hands. By calculating the locations of over 22 points, we can determine the bend, pitch, and yaw of our hands and match them against our data. On the server end, an AI processes all this information and generates a list of possible matches.
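To give a feel for the geometry involved: pitch and yaw can be derived from a hand's 3D direction vector with two `atan2` calls. This sketch assumes a right-handed frame with y up and -z pointing forward, which matches the Leap Motion convention, but it is standalone math rather than an SDK call:

```python
import math

def pitch(direction):
    """Angle above/below the horizontal plane, in radians.
    Assumes y is up and -z is straight ahead (Leap Motion convention)."""
    x, y, z = direction
    return math.atan2(y, -z)

def yaw(direction):
    """Angle left/right of straight ahead, in radians."""
    x, y, z = direction
    return math.atan2(x, -z)

# A hand pointing straight forward has zero pitch and zero yaw.
forward = (0.0, 0.0, -1.0)
print(pitch(forward), yaw(forward))  # → 0.0 0.0

# Tilted 45 degrees upward:
up45 = (0.0, 1.0, -1.0)
print(math.degrees(pitch(up45)))  # → 45.0
```

Computing angles like these for each tracked point turns a raw frame into the kind of compact feature vector the matcher can consume.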
Challenges we ran into
Our data set is outdated and was captured using different technology. Training our AI with TensorFlow was also difficult: importing our data took a long time, and we had to figure out the black box of magic that is machine learning!
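The data-import step boils down to parsing recorded frames into labeled feature vectors. The layout below (one frame per row, x,y,z coordinates for each tracked point, label last) is a hypothetical example, not the actual dataset's format:

```python
import csv
import io

# Hypothetical sample: two frames, each with x,y,z for two tracked points
# followed by the sign's label. The real dataset came from older hardware
# and used a different layout, so this is an assumption for illustration.
RAW = """\
12.0,200.5,-30.1,15.2,210.0,-28.4,hello
40.7,180.2,-10.0,42.1,175.9,-12.3,yes
"""

def load_frames(text):
    """Parse CSV rows into (feature_vector, label) pairs."""
    samples = []
    for row in csv.reader(io.StringIO(text)):
        *coords, label = row
        samples.append(([float(c) for c in coords], label))
    return samples

samples = load_frames(RAW)
print(len(samples), samples[0][1])  # → 2 hello
```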
Accomplishments that we're proud of
We completed our first hardware hack! It was definitely a challenge using two new sets of APIs and learning Python along the way. In the end, we built something that integrates the two, an opportunity we would never have had elsewhere.
What we learned
- [x] How to use python
- [x] A better understanding of Leap Motion
- [x] The basics of machine learning
What's next for Project Sign
Admittedly, there are still plenty of bugs and issues with this project. As a first step, we would like more data and more time to train our AI so it can produce more accurate word matches. Our immediate goal is to make it robust and reliable!
Built With
- leap-motion
- python
- tensor-flow