People who are deaf or mute have higher suicide rates than the general public (a 30-40% higher risk). According to the National Library of Medicine, they are also four times as likely to have a mental illness.

Our app combines nodes with public and private databases to let users sign in Sign Language, which it translates into text and speech.
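At a high level, the translation step takes a sequence of signed frames, classifies each one, and joins the labels into readable text. Here is a minimal sketch of that idea; `predict_sign` is a hypothetical stand-in for the trained model, not the actual implementation:

```python
def predict_sign(frame):
    """Stand-in for the trained classifier; returns a label for one frame."""
    # A real implementation would run the model on the frame here.
    return "HELLO" if frame else "UNKNOWN"

def signs_to_text(frames):
    """Translate a sequence of signed frames into a text string."""
    labels = [predict_sign(f) for f in frames]
    # Collapse consecutive duplicates so a held sign becomes one word.
    words = [labels[0]] if labels else []
    for lab in labels[1:]:
        if lab != words[-1]:
            words.append(lab)
    return " ".join(words).lower()
```

The resulting text can then be fed to any text-to-speech engine to produce the spoken output.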

We trained it on public datasets, such as Kaggle's public ASL dataset, along with our own data. We ran into many challenges, and the biggest was accuracy: it was poor because the data we were using was low quality and had never been cleaned. Once we fixed and cleaned the data, our accuracy jumped from around 4% to 95%.
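The cleaning step that drove that accuracy jump can be sketched like this. This is a hedged illustration, not our exact pipeline; the resolution threshold and sample format are assumptions:

```python
def clean_dataset(samples):
    """Drop samples that are empty, unlabeled, or too small to be useful.

    Each sample is assumed to be an (image, label) pair, where image is a
    flat list of pixel values. Real cleaning would also catch mislabeled
    and duplicate samples.
    """
    cleaned = []
    for image, label in samples:
        if not image or label is None:
            continue  # discard empty frames and missing labels
        if len(image) < 4:  # stand-in for a minimum-resolution check
            continue
        cleaned.append((image, label))
    return cleaned
```

Filtering out unusable samples before training means the model only ever sees signs it can actually learn from, which is why cleaning had such an outsized effect on accuracy.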

We think HandSpeak has a big future. We want to expand the model so it can identify more words.
