Inspiration - Everyone wants to communicate, so why shouldn't everyone be able to?
What it does - It recognizes Indian Sign Language, helping us understand the people we cannot hear but can feel.
How we built it - Using a transfer learning approach, we built a deep learning model on an Indian Sign Language dataset.
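A minimal sketch of the transfer-learning setup described above, assuming a Keras workflow with a MobileNetV2 backbone and a 26-class sign alphabet (both are our assumptions, not details from the project; in practice `weights="imagenet"` would load the pretrained features):

```python
import tensorflow as tf

NUM_CLASSES = 26  # hypothetical: one class per sign in the ISL alphabet

# Pretrained backbone; weights=None here only to stay offline in this sketch.
# For actual transfer learning, use weights="imagenet".
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3),
    include_top=False,
    weights=None,
)
base.trainable = False  # freeze the pretrained feature extractor

# New classification head trained on the sign language dataset.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```

Freezing the backbone means only the small head is trained, which is what makes transfer learning viable on a modest sign-language dataset.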
Challenges we ran into - The model was initially inaccurate, and feature extraction proved to be the hardest part of the task.
Accomplishments that we're proud of - It helps people with hearing disabilities feel accepted rather than like outliers, because here we are building something for them.
What we learned - Flask, Next.js, teamwork, and machine learning approaches.
What's next for HANDS-WE SPEAK - We plan to redevelop it as an Android application, along with a tutorial so that anyone can learn to use it.