Team Number 112

Sign language is one of the oldest and most natural forms of communication, but because most people do not know sign language and interpreters are difficult to come by, we have developed a real-time method using neural networks for fingerspelling-based American Sign Language.
In this method, the hand image is first passed through a filter, and the filtered image is then passed through a classifier that predicts the class of the hand gesture. This method achieves 98.00% accuracy for the 26 letters of the alphabet.
American Sign Language is a predominant sign language. Since the only disability Deaf and Mute (D&M) people have is communication-related and they cannot use spoken languages, the only way for them to communicate is through sign language.
Communication is the process of exchanging thoughts and messages in various ways, such as speech, signals, behavior, and visuals.
D&M people use different hand gestures to express their ideas to other people.
Gestures are nonverbally exchanged messages that are understood through vision. This nonverbal communication of D&M people is called sign language.
Built With
- communication
- computer-vision
- deep-learning
- hand-gestures
- machine-learning
- neural-network
- opencv
- python
- sign-language