Inspiration
We believe that communication is how we, as a society, build and foster our future. One group stifled in its ability to communicate with others is the deaf and mute community. Inspired by our team's shared interest in BSL, we built Signergy.
What it does
Our application scans hand movements, matches them against a dataset of BSL signs, and displays the character corresponding to each sign.
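The matching step can be sketched as a nearest-neighbour lookup of a flattened landmark vector against stored sign templates. All names and template values below are hypothetical stand-ins, not the project's actual dataset:

```python
import math

# Hypothetical miniature "dataset": one flattened landmark vector per sign.
# A real hand has 21 landmarks; two 2-D points keep the sketch short.
SIGN_TEMPLATES = {
    "A": [0.1, 0.9, 0.2, 0.8],
    "B": [0.5, 0.1, 0.5, 0.9],
}

def match_sign(features, templates=SIGN_TEMPLATES):
    """Return the character whose template is closest (Euclidean) to features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda char: dist(features, templates[char]))
```

With templates like these, a landmark vector near a stored sign resolves to that sign's character, e.g. `match_sign([0.5, 0.1, 0.5, 0.85])` gives `"B"`.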
How we built it
We built this using OpenCV for camera capture, MediaPipe for hand recognition, and pandas, NumPy and scikit-learn for the data frames and model training.
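A minimal sketch of how those pieces fit together, assuming one tracked hand per frame; the exact settings and the classifier hand-off are assumptions, not the project's actual code:

```python
def flatten_landmarks(landmarks):
    """Flatten (x, y) hand landmark pairs into a single feature row."""
    row = []
    for x, y in landmarks:
        row.extend([x, y])
    return row

def run_camera_loop():
    # Heavy imports kept local so the helper above stays importable anywhere.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        result = hands.process(rgb)
        if result.multi_hand_landmarks:
            pts = [(lm.x, lm.y)
                   for lm in result.multi_hand_landmarks[0].landmark]
            features = flatten_landmarks(pts)
            # ...pass `features` to the trained classifier here...
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

if __name__ == "__main__":
    run_camera_loop()
```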
Challenges we ran into
We ran into many challenges along the way. Most of the technologies we used were new to us, namely machine learning and Neuphonics. Machine learning was especially challenging, as the main goal of our project was reading hand signs.
Accomplishments that we're proud of
In the end, we managed to get hand tracking working and convert signs to characters. We also managed to convert these characters into text for our mute users.
What we learned
- Machine Learning (ML) : Getting a dataset and training a model
- Different Python libraries such as OpenCV, TensorFlow, MediaPipe, scikit-learn, NumPy and pandas in the context of ML
- British Sign Language (BSL)
- Trading, profit and risks
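The "getting a dataset and training a model" step can be sketched with scikit-learn's k-nearest-neighbours classifier. The data below is randomly generated stand-in data, not the project's real landmark dataset:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Stand-in data: 40 rows of 42 landmark coordinates for two made-up signs,
# drawn from two well-separated clusters so the example trains cleanly.
X = np.vstack([
    rng.normal(0.2, 0.02, size=(20, 42)),  # sign "A"
    rng.normal(0.8, 0.02, size=(20, 42)),  # sign "B"
])
y = ["A"] * 20 + ["B"] * 20

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```

On real landmark data the same fit/score workflow applies; only the feature rows and labels change.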
What's next for Signergy
We'd love to do the reverse: converting speech into sign language actions the user can perform, since education and improving communication are at the heart of this project. We'd also like to use hand gestures to scroll the text box up and down through previous and current conversations, and to delete words, as we've only implemented these with keyboard keys so far.