Inspiration
I first started learning ASL through my brother, whose friend is deaf and has told us about the challenges deaf students face in the classroom. This inspired us to build an ASL-to-English translator to empower those kids.
What it does
ASLConnect is a communication tool that bridges the gap between deaf and hearing people. The project uses a live video stream and machine learning to recognize ASL gestures and translate them into English in real time.
How we built it
We used a gesture recognition model from MediaPipe to detect the "I love you" sign.
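The core idea can be sketched in plain Python: MediaPipe's hand tracker outputs 21 landmarks per hand (index 0 is the wrist; 4, 8, 12, 16, and 20 are the fingertips), and the "I love you" sign is thumb, index, and pinky extended with middle and ring folded. The heuristic below is an illustrative sketch, not our exact detection logic, and the indices assume MediaPipe's standard hand-landmark layout.

```python
# Sketch: classify the "I love you" sign from 21 (x, y) hand landmarks.
# Assumes MediaPipe Hands indexing: 0 = wrist, 4/8/12/16/20 = fingertips,
# 2/5/9/13/17 = the corresponding lower knuckles.

WRIST = 0
TIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "pinky": 20}
KNUCKLES = {"thumb": 2, "index": 5, "middle": 9, "ring": 13, "pinky": 17}

def relative_to_wrist(landmarks):
    """Express every landmark relative to the wrist, so the gesture
    is independent of where the hand sits in the frame."""
    wx, wy = landmarks[WRIST]
    return [(x - wx, y - wy) for x, y in landmarks]

def finger_extended(landmarks, finger):
    """A finger counts as extended if its tip is farther from the
    wrist than its knuckle (a crude but workable heuristic)."""
    rel = relative_to_wrist(landmarks)
    tip, knuckle = rel[TIPS[finger]], rel[KNUCKLES[finger]]
    dist = lambda p: (p[0] ** 2 + p[1] ** 2) ** 0.5
    return dist(tip) > dist(knuckle)

def is_ily_sign(landmarks):
    """'I love you': thumb, index, pinky extended; middle, ring folded."""
    ext = {f: finger_extended(landmarks, f) for f in TIPS}
    return (ext["thumb"] and ext["index"] and ext["pinky"]
            and not ext["middle"] and not ext["ring"])
```

In practice MediaPipe's own gesture recognizer handles this classification for us, but expressing landmarks relative to the wrist is the same trick that makes recognition robust to where the hand appears on screen.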
Challenges we ran into
Training a gesture recognition model from scratch requires substantial data: thousands of examples, if not more, for each gesture. Given the time constraints, it would have been infeasible to train a comprehensive model to recognize even a single ASL letter, which led to our decision to use MediaPipe.
Accomplishments that we're proud of
We got the project running: it detects each landmark on the hand relative to the palm and recognizes a gesture.
What we learned
We learned that no widely available service offers live ASL translation yet.
What's next for ASLConnect
We plan to incorporate additional ASL signs into our system. Our strategy is to use MediaPipe's custom gesture feature to train the model to recognize and interpret new signs, expanding the system's sign language vocabulary and enhancing its functionality. We also plan to incorporate a translation API to convert ASL gestures into multiple languages, making the system accessible to a global audience.
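The planned pipeline (gesture label to English word to target language) could be wired up along these lines. This is a sketch only: the tiny in-memory dictionaries are hypothetical stand-ins for the translation API we have not yet chosen, and the label names follow MediaPipe's canned gesture set.

```python
# Sketch of the planned pipeline: recognized gesture label -> English
# phrase -> target language. The dictionaries below are hypothetical
# stand-ins; a production system would call a real translation API.

GESTURE_TO_ENGLISH = {
    "ILoveYou": "I love you",  # label from MediaPipe's canned gesture set
    "Thumb_Up": "yes",
    "Thumb_Down": "no",
}

# Stand-in translations for illustration only.
TRANSLATIONS = {
    ("I love you", "es"): "te quiero",
    ("yes", "es"): "sí",
    ("no", "es"): "no",
}

def translate_gesture(label, target_lang="en"):
    """Map a gesture label to English, then optionally to another language."""
    english = GESTURE_TO_ENGLISH.get(label)
    if english is None:
        return None  # unknown gesture
    if target_lang == "en":
        return english
    # Fall back to English if no translation is available.
    return TRANSLATIONS.get((english, target_lang), english)
```

Keeping the recognition step and the translation step separate like this means new signs and new target languages can be added independently.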