Inspiration
Around 360 million people worldwide suffer from hearing disabilities. Whether communicating on a day-to-day basis or in an emergency, they face countless difficulties. Technology has advanced enormously, yet we still don't have an app that helps deaf people communicate: an app that translates sign language into text and speech. There is no platform to learn the ASL alphabet, and what could be better than learning sign language from the comfort of your home, in front of your laptop, through a machine? Many NGOs take initiatives, but they go unheard and unseen. Snapchat is used by 498 million people all over the world, 82% of whom are young people who are encouraging and supportive of new ideas. So, we decided to build an app through Snap Kit.
What it does
- The app makes use of the Snap Login Kit to protect users' privacy and log them in seamlessly.
- The Creative Kit lets users share the website on their Snaps, drawing more and more people towards building an inclusive community and making the website interactive.
- The Bitmoji Kit lets the community express their personality with just a tap, and also makes the website fun and interactive.
- The user can 'Talk With Sign': users point the camera at the signer and see their signs translated into text and speech.
- The Practice Mode lets users learn the ASL alphabet, which can prove very helpful in emergencies for emergency responders, teachers, and everyone else!
- The Forum allows people to come together, discuss their experiences and help each other out.
- The Webinar feature makes the app a one-stop platform to host webinars and join them from the website itself. The Creative Kit share button helps increase their reach and raise awareness.
- The translators convert English to ASL and ASL to English.
- The list of common words helps users expand their sign vocabulary.
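Under the hood, the 'Talk With Sign' translation boils down to matching per-finger pose estimates against a set of letter templates. Here is a minimal sketch of that matching step in plain JavaScript; the template values and the `classify` helper are illustrative assumptions, not exact ASL definitions (the real app uses Fingerpose and Mediapipe):

```javascript
// Simplified sketch of sign-to-letter matching.
// Each template records the expected curl state of the five fingers
// (0 = no curl, 1 = half curl, 2 = full curl) for one ASL letter.
// Template values here are illustrative, not exact ASL hand shapes.
const TEMPLATES = {
  A: [0, 2, 2, 2, 2], // thumb out, four fingers curled
  B: [2, 0, 0, 0, 0], // thumb tucked, four fingers extended
  E: [2, 2, 2, 2, 2], // all fingers curled
};

// Return the letter whose template best matches the observed curls,
// or null if nothing matches closely enough.
function classify(curls, maxDistance = 1) {
  let best = null;
  let bestDist = Infinity;
  for (const [letter, template] of Object.entries(TEMPLATES)) {
    const dist = template.reduce(
      (sum, expected, i) => sum + Math.abs(expected - curls[i]), 0);
    if (dist < bestDist) { bestDist = dist; best = letter; }
  }
  return bestDist <= maxDistance ? best : null;
}

console.log(classify([0, 2, 2, 2, 2])); // "A"
console.log(classify([1, 1, 0, 2, 1])); // null (no close match)
```

In the actual pipeline, the curl values come from the hand-tracking model rather than being hard-coded, and each matched letter is appended to the text output and passed on to speech synthesis.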
How we built it
We built it using Node.js, Express, and AstraDB. We used Snap Kit to integrate Login Kit, Bitmoji Kit, and Creative Kit into our website via the Snap SDKs. For the frontend, we used HTML, CSS, and JavaScript. We trained the ASL models using Fingerpose and Mediapipe.
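Mediapipe produces 21 3-D hand landmarks per frame, and Fingerpose-style recognition reduces those to per-finger curl estimates by looking at the angle along each finger's joints. Here is a rough sketch of that curl computation; the threshold values and the `curlClass` helper are our own illustrative assumptions, not Fingerpose's exact internals:

```javascript
// Estimate how curled a finger is from three of its landmarks:
// the base (MCP), middle (PIP), and tip joints, each {x, y, z}.
// The angle at the middle joint is near 180 degrees for a straight
// finger and shrinks as the finger curls.
function jointAngle(base, mid, tip) {
  const v1 = [base.x - mid.x, base.y - mid.y, base.z - mid.z];
  const v2 = [tip.x - mid.x, tip.y - mid.y, tip.z - mid.z];
  const dot = v1[0] * v2[0] + v1[1] * v2[1] + v1[2] * v2[2];
  const norm = (v) => Math.hypot(v[0], v[1], v[2]);
  return (Math.acos(dot / (norm(v1) * norm(v2))) * 180) / Math.PI;
}

// Map the angle to a coarse curl class; thresholds are assumptions.
function curlClass(base, mid, tip) {
  const angle = jointAngle(base, mid, tip);
  if (angle > 150) return "no curl";
  if (angle > 90) return "half curl";
  return "full curl";
}

// A straight finger along the y-axis gives a 180-degree joint angle:
console.log(curlClass({x: 0, y: 0, z: 0},
                      {x: 0, y: 1, z: 0},
                      {x: 0, y: 2, z: 0})); // "no curl"
```

The per-finger curl classes are then compared against gesture descriptions for each ASL letter, which is what made a JavaScript-only training and inference pipeline possible.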
Challenges we ran into
We were new to Snap Kit, so it took us a little time to integrate the Kits into the app. We had only ever coded in JavaScript, so finding a way to train the models in JavaScript took some brainstorming. We also made the website responsive so it could be used on mobile.
Accomplishments that we're proud of
We were pretty excited about integrating Snap into our app, and we are proud that we eventually did. We built a full-stack application that solves a real-life problem, helping the community and making our contribution to it. We are also really proud that we trained our models using only JavaScript.
What we learned
We learned about Snap Kit. We ourselves, in fact, learned a lot about sign language over the course of building this project. We also learned about machine learning in JavaScript.
What's next for UniSign
We wish to convert the website into a mobile app to make it more useful and accessible. We also want to scale the dataset by adding more signs for the user.
Test Credentials
username - test_accoun2523
password - test@123
Built With
- astradb
- bcrypt
- body-parser
- bootstrap
- connect-flash
- css
- ejs
- express.js
- google-cloud
- html
- javascript
- jquery
- mediapipe
- node.js
- snap-kit
- sweetalert