Inspiration
Our inspiration for this project was to bring communities closer together and make them more accessible to one another. This idea also tied in with the hackathon's theme, "Connecting the Dots".
What it does
The app answers queries asked in either spoken or sign language, assisting people regardless of their disabilities. For example, if a visually impaired person wants to communicate with a person who cannot speak, the sign language is converted to speech so that the signer can be heard.
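The flow described above can be sketched as a small pipeline. This is a minimal illustration, not the app's actual code: the recognizer and text-to-speech steps are stubbed, and all function names here are hypothetical.

```python
# Hypothetical sketch of the sign-to-speech pipeline; the recognizer and
# TTS are stubs standing in for the deployed gesture model and a TTS service.

def recognize_sign(frame: bytes) -> str:
    """Stub: in the app this would call the deployed gesture-recognition model."""
    return "hello"  # placeholder gesture label

def text_to_speech(text: str) -> bytes:
    """Stub: in the app this would call a text-to-speech service for audio."""
    return text.encode("utf-8")  # placeholder audio bytes

def sign_to_speech(frame: bytes) -> bytes:
    """Pipeline: camera frame -> recognized text -> spoken audio."""
    return text_to_speech(recognize_sign(frame))
```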
How we built it
We trained our entire deep learning model on Google Cloud: the gesture-recognition model was trained with AutoML and deployed to Google Cloud. We built an Android app in Android Studio that captures photos or videos of gestures and returns a fitting reply. Our backend was also hosted on Google Cloud.
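As a rough sketch of how the backend might talk to the deployed AutoML model: a prediction request to AutoML Vision carries the image bytes in a payload, addressed to the model's full resource name. The project and model IDs below are placeholders, not values from the project.

```python
# Sketch of the request pieces for an AutoML Vision online prediction.
# Project/model IDs are placeholders; this builds the request locally and
# does not contact Google Cloud.
from typing import Dict

def model_path(project_id: str, model_id: str, location: str = "us-central1") -> str:
    """Full resource name of a deployed AutoML model."""
    return f"projects/{project_id}/locations/{location}/models/{model_id}"

def build_predict_payload(image_bytes: bytes) -> Dict:
    """Payload shape AutoML Vision expects for a single image."""
    return {"image": {"image_bytes": image_bytes}}

# In the real backend, these would be passed to the AutoML prediction
# client, e.g. client.predict(name=..., payload=...).
```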
Challenges we ran into
We hosted our whole project on Google Cloud, and deploying such a large model to the cloud takes a lot of time. Since this was our first time using the cloud, we ran into many problems, and people from Google helped us work through them.
Accomplishments that we're proud of
Our app helps bring physically challenged communities closer to the wider community, so that no one feels separated from others.
What we learned
At the start of Signally, we put ourselves in the shoes of people with disabilities to understand the problems they face. We drew inspiration from that and then worked to make our ideas feasible for them to use. We trained a model in AutoML that can detect sign language and convert it into both text and speech. It not only helps people who cannot speak convey their message, but also lets them receive the answer in their own language.
What's next for Signally
We want to turn it into a utility app that every physically challenged person carries, so that it grows into a popular peer-based network among them. We could also partner with companies to offer gift cards or goodies as rewards to people who help others.