Inspiration
We wanted to create something simple and elegant that could still have a positive impact on many people. We also wanted to experiment with new technologies in our hack.
What it does
SignBuddy is a P2P platform that allows differently-abled people to experience full video chat with their peers. It can translate the 26 letters of the American Sign Language alphabet using the Leap Motion and display them on screen for the user.
How we built it
We used a Python Flask server to communicate with our simple HTML, CSS, and JavaScript front-end. Hand recognition is done with a support vector machine trained using the Python scikit-learn library. The Leap Motion device observes the hand with the palm as the origin and the joints of each finger measured as coordinates. The application then finds potential matches by comparing the current hand position against our trained sample data, and the result is displayed on screen as text, roughly as in the sketch below.
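A rough sketch of that pipeline is shown here. The joint count, training data, and helper names (`to_feature_vector`, `predict_letter`) are illustrative placeholders, not our exact code; real samples came from recorded Leap Motion frames rather than random numbers.

```python
import numpy as np
from sklearn.svm import SVC

N_JOINTS = 20  # assumed joint count: ~4 joints per finger across 5 fingers

def to_feature_vector(palm, joints):
    """Express every finger joint relative to the palm centre so the
    feature vector depends on hand shape, not where the hand sits
    above the sensor."""
    palm = np.asarray(palm, dtype=float)
    return np.concatenate([np.asarray(j, dtype=float) - palm for j in joints])

# Placeholder training set: in the real project each sample was a recorded
# hand pose from the Leap Motion, labelled with the ASL letter it shows.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(260, N_JOINTS * 3))   # 10 samples per letter
y_train = np.repeat(list("ABCDEFGHIJKLMNOPQRSTUVWXYZ"), 10)

clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)

def predict_letter(palm, joints):
    """Classify the current hand pose as one of the 26 ASL letters."""
    features = to_feature_vector(palm, joints).reshape(1, -1)
    return clf.predict(features)[0]

# Example: one frame of (palm, joint) coordinates -> predicted letter.
palm = (0.0, 150.0, 0.0)
joints = [(rng.normal(), 150 + rng.normal(), rng.normal()) for _ in range(N_JOINTS)]
print(predict_letter(palm, joints))
```

Measuring each joint relative to the palm origin keeps the classifier focused on the shape of the sign rather than the hand's absolute position over the sensor, which is why the palm is treated as the origin in our feature encoding.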
Challenges we ran into
Initially we attempted to use Indico's API for image recognition, but while building our custom collection of sign language images we found that we were often exceeding the call limit for our API key. In addition, images in which the hand shapes were quite similar were very hard for the image recognition system to differentiate. We therefore combined Indico's API with the Leap Motion API's hand object detection to complete our project.
Another issue came with the video chat: we were unable to fully debug the WebRTC server infrastructure, so it wasn't fully functional online. Instead, we emulated the P2P chat via an external IP-enabled webcam.
Accomplishments that we're proud of
Figuring out a mechanism for sign-language recognition and ultimately producing a working product that realized our vision.
What we learned
We learned a lot about working with databases and machine learning.
What's next for SignBuddy
Full P2P video and text-chat integration, word recognition instead of just letters, and increasing our data sample size to improve the quality of the sign language prediction.