Inspiration
While brainstorming for this hackathon, Jack thought about projects that could genuinely help people. A chat app for people with disabilities came to mind, and after bouncing ideas off the rest of the team, SignVision was born. The goal is to help hearing-impaired people around the world communicate, and to help everyone else understand them better.
What it does
SignVision enables effective communication between fluent speakers and ASL signers. Connect to a session, choose a name, and get started with your call! Our client supports two methods of communication: speaking and signing. Speakers click a button, say something, then click it again to process what they said; the app converts the speech to text and sends it through the chat box. Signers record an ASL phrase, and SignVision decodes it into English via machine learning, sending the result through the chat box. This translation process eliminates the need for an interpreter and provides a much more direct, personalized form of communication between signers and speakers.
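The speaker's two-click flow described above can be sketched as a small state machine. This is a minimal illustration, not the real client code: `transcribe` and `sendToChat` are hypothetical pluggable callbacks standing in for the app's speech-to-text call (IBM Watson in the real project) and its chat-box send.

```javascript
// Sketch of the push-to-talk flow: first click starts capturing,
// second click transcribes what was captured and sends it to the chat.
// transcribe() and sendToChat() are assumed callbacks, not real API names.
function createPushToTalk(transcribe, sendToChat) {
  let recording = false;
  const chunks = [];
  return {
    toggle() {
      if (!recording) {
        recording = true;   // first click: start capturing audio
        chunks.length = 0;
      } else {
        recording = false;  // second click: transcribe and send
        const text = transcribe(chunks);
        sendToChat({ type: 'speech', text });
      }
      return recording;
    },
    feed(chunk) {
      if (recording) chunks.push(chunk); // audio arriving while recording
    },
  };
}
```

The same toggle shape would work for the signer's record-a-phrase button, with the transcription step swapped for an image-recognition call.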
How we built it
The SignVision web app is built with standard web technologies (HTML, CSS, JavaScript), along with jQuery, websocket.io, and IBM Watson's Speech-to-Text and image recognition APIs.
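At its core, the session layer just relays each translated message to the other participants in the call. Here is a hedged sketch of that relay as an in-memory broadcast; the names (`createSession`, `join`, `send`) are illustrative assumptions, not the project's actual websocket.io server code.

```javascript
// Sketch of a chat session relay: each client joins with a name and a
// delivery callback, and messages are broadcast to everyone except the sender.
function createSession() {
  const clients = new Map(); // name -> deliver(message) callback
  return {
    join(name, deliver) {
      clients.set(name, deliver);
    },
    send(from, text) {
      for (const [name, deliver] of clients) {
        if (name !== from) deliver({ from, text }); // skip the sender
      }
    },
  };
}
```

In the real app the delivery callbacks would be WebSocket emits rather than in-process function calls, but the broadcast logic is the same.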
Challenges we ran into
One of the biggest challenges was getting sign-language translation to work at all. After trying two other libraries, we settled on Watson's API, which proved far more reliable for this use case.
Accomplishments that we're proud of
Our team is proud to have built the first real-time ASL-translation video chat between two clients, running directly in the browser.
What we learned
Nathan, the team member who trained Watson, picked up many machine-learning skills, and the whole team learned a great deal of patience. A lot. On the coding side, we learned how professional APIs work and how to use them properly.
What's next for SignVision
Next, we plan to add more accurate recognition as well as ASL video tracking. The app currently supports only still images, but given a few more hours, it would be up and running with a live ASL tracking feed!