Inspiration
I want to learn sign language interactively. After all, staring at a YouTube video and trying to match the signer's hands is pretty difficult.
What it does
A real-time sign language interpretation system that uses computer vision and AI to recognize and translate American Sign Language (ASL) gestures. The application offers both practice and learning modes to help users build and improve their signing skills.
How we built it
Frontend: HTML5, CSS3, JavaScript, Webcam API
Backend: Python, Flask, Google Gemini AI API, PIL (Python Imaging Library)
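The core flow is simple: the browser captures a webcam frame, posts it to the Flask backend, which decodes it with PIL and forwards it to Gemini for interpretation. A minimal sketch of what that backend could look like is below. The `/interpret` route name and the prompt text are our own illustrative choices, not necessarily what the project uses, and the Gemini call is shown commented out since it needs an API key at runtime:

```python
import base64
import io

from flask import Flask, request, jsonify
from PIL import Image
# The Gemini SDK would be configured with an API key at startup:
# import google.generativeai as genai
# genai.configure(api_key=...)

app = Flask(__name__)


def decode_frame(data_url: str) -> Image.Image:
    """Decode a base64 data-URL frame (as produced by canvas.toDataURL
    on the frontend) into a PIL image."""
    _, encoded = data_url.split(",", 1)
    return Image.open(io.BytesIO(base64.b64decode(encoded)))


@app.route("/interpret", methods=["POST"])
def interpret():
    # The frontend sends {"image": "data:image/png;base64,..."}.
    frame = decode_frame(request.json["image"])
    # A multimodal Gemini request would pass the prompt and image together:
    # model = genai.GenerativeModel("gemini-1.5-flash")
    # response = model.generate_content(
    #     ["Identify the ASL sign shown in this image.", frame])
    # return jsonify({"sign": response.text})
    return jsonify({"sign": "placeholder"})
```

Sending whole frames to a hosted model keeps the client thin, at the cost of per-frame latency, which is why recognition works best on held, static signs.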
Challenges we ran into
Accomplishments that we're proud of
The AI reliably recognizes simple, common signs.
What we learned
For future development, we plan to fine-tune the model to improve recognition accuracy.