Inspiration
Our journey began with a desire to bridge communication gaps between individuals who are deaf or hard of hearing and the rest of the world. Witnessing the challenges faced by this community in expressing themselves inspired us to create a solution that fosters meaningful connections through technology. We aimed to develop an app that not only translates sign language but also raises awareness about the importance of inclusivity in communication.
Challenges we ran into
- ASL Dataset Acquisition: Finding a comprehensive ASL dataset suitable for training our model proved a significant challenge, so we ended up building our own.
- Hand Landmark Detection: Implementing MediaPipe for accurate hand landmark detection required extensive experimentation to ensure reliable gesture recognition in various lighting conditions and angles.
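One reason landmark-based recognition holds up across lighting conditions and angles is that the raw (x, y) coordinates can be normalized before classification. The sketch below is a minimal illustration, not code from the project: it assumes MediaPipe-style hand landmarks (21 points, landmark 0 at the wrist) and shows one common normalization trick — wrist-relative, scale-normalized coordinates — that makes features independent of where the hand sits in the frame and how close it is to the camera. The function name is our own.

```python
import math

def normalize_landmarks(landmarks):
    """Turn 21 (x, y) hand landmarks into position- and scale-invariant features.

    landmarks: list of 21 (x, y) tuples, as a MediaPipe-style detector
    would provide (landmark 0 is the wrist). Returns a flat list of 42 floats.
    """
    # Translate so the wrist (landmark 0) becomes the origin:
    # the hand's position in the frame no longer matters.
    wx, wy = landmarks[0]
    rel = [(x - wx, y - wy) for x, y in landmarks]
    # Divide by the largest wrist-to-landmark distance: the hand's
    # apparent size (camera distance) no longer matters either.
    scale = max(math.hypot(x, y) for x, y in rel) or 1.0
    return [c / scale for x, y in rel for c in (x, y)]
```

With this normalization, the same gesture made in different corners of the frame, or nearer and farther from the camera, yields roughly the same feature vector.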
What we learned
Throughout the development process, we gained valuable insights into:
- Sign Language and Communication: Understanding the nuances of American Sign Language (ASL) and its structure was crucial for effective translation.
- Technology Integration: We learned how to integrate OpenCV for real-time image processing and MediaPipe for hand landmark detection.
- User-Centered Design: Designing an intuitive interface using Figma helped us focus on user experience, ensuring the app is accessible for both signers and non-signers.
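To make the recognition step above concrete: once landmarks are reduced to feature vectors, classifying a gesture means comparing those features against labeled examples. Our app used scikit-learn for this; the stand-alone sketch below substitutes a tiny nearest-centroid classifier in plain Python so the idea is visible without dependencies. The labels and feature values here are made up for illustration.

```python
import math

def train_centroids(samples):
    """samples: dict mapping a gesture label (e.g. "A", "B") to a list of
    feature vectors. Returns the mean vector (centroid) for each label."""
    centroids = {}
    for label, vectors in samples.items():
        n = len(vectors)
        # Average each feature column across this label's examples.
        centroids[label] = [sum(col) / n for col in zip(*vectors)]
    return centroids

def classify(features, centroids):
    """Return the label whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))
```

In the actual app a scikit-learn model trained on our recorded dataset plays this role, but the shape of the pipeline is the same: camera frame in, landmarks out, features out, predicted sign out.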
Built With
- figma
- matplotlib
- mediapipe
- numpy
- opencv
- python
- scikit-learn