💡 Inspiration

As developers, we've always been passionate about using technology to make a positive impact on society. When we learned about the challenges faced by people who use sign language to communicate, and the difficulty they often have in accessing technology that can help them, it really struck a chord with us.

We were inspired by the incredible resilience and creativity of people who use sign language to communicate every day, despite facing significant barriers in their daily lives. We knew that there had to be a way for technology to help bridge this gap and enable more people to communicate more easily, more confidently, and more comfortably.

That's what led us to start working on this project. By using machine learning to identify sign language gestures, we believe that we can create a tool that is truly transformative for people who use sign language. It has the potential to break down barriers, to enable more inclusive communication, and to empower individuals to express themselves in ways they may not have thought possible before.

This project has brought together our passion for technology with our commitment to inclusivity, accessibility, and social impact. We are proud to be working on something that has the potential to make a real difference in people's lives, and we're excited to see where it goes from here.

As we continued to work on this project, we were struck by the potential for it to create opportunities for people who use sign language. By providing a more accurate and robust way to interpret sign language gestures, we believe that our system can open up new possibilities for communication, education, and employment. Imagine being able to attend a lecture or job interview conducted in sign language, from anywhere in the world! We believe that this is just the beginning of what technology can do to help create a more inclusive society, and we're excited to be a part of it.

🤖 What it does

Our sign language detection system, Sign Ease, is designed to help people learn and improve their sign language skills. By using machine learning to identify and interpret sign language gestures, it acts as a virtual sign language teacher that provides real-time feedback and guidance. To use the system, a user simply performs a sign language gesture in front of their webcam or smartphone camera. The system captures the video feed in real time and compares the user's gesture against our database of known sign language gestures. If the gesture is recognized, the system gives feedback on its correctness, turning practice into a real-time learning experience.
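The matching step described above can be sketched as a nearest-neighbour lookup over flattened hand-landmark vectors. The gesture names, landmark values, and distance threshold below are made up for illustration; the real system compares many more landmark coordinates per gesture.

```python
import math

# Toy "database" of known gestures: each maps a name to a flattened
# list of normalised hand-landmark coordinates (a real system would use
# all 21 (x, y, z) MediaPipe landmarks; 4 values here for brevity).
KNOWN_GESTURES = {
    "hello":  [0.1, 0.9, 0.2, 0.8],
    "thanks": [0.7, 0.2, 0.6, 0.3],
}

def match_gesture(landmarks, threshold=0.5):
    """Return the closest known gesture, or None if nothing is near enough."""
    best_name, best_dist = None, float("inf")
    for name, ref in KNOWN_GESTURES.items():
        dist = math.dist(landmarks, ref)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(match_gesture([0.12, 0.88, 0.22, 0.79]))  # prints "hello"
```

If no stored gesture is within the threshold, the lookup returns `None`, which is where the "try again" feedback would come from.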

We have also integrated a quiz feature that people can use to quickly revise and learn signs, like a cheat sheet.

Overall, we believe that Sign Ease has the potential to be a game-changer for sign language education. By providing a fun, interactive, and personalized learning experience, we aim to help more people learn sign language and improve their communication skills.

🧠 How we built it

Planning and Research: Before starting development, our team spent time planning the project and researching the technologies and tools that would be required to bring it to life.

Front-End Development: We used React, a popular front-end JavaScript library, to build the user interface for our application. Additionally, we utilized Tailwind, a CSS framework, to create a responsive design and streamline styling.

Back-End Development: For the back end, we chose Python as the primary programming language and Streamlit as the web framework for the server-side logic and business rules. This allowed us to develop interactive data applications and ML models easily, without requiring advanced web development skills.
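As an illustration, a Streamlit page for the feedback loop might look like the sketch below. The widget labels, sign names, and the `feedback_message` helper are our own placeholders, not the production code; Streamlit is imported lazily so the helper works on its own.

```python
def feedback_message(predicted: str, expected: str) -> str:
    """Pure helper: turn a model prediction into learner feedback."""
    if predicted == expected:
        return f"Correct! That's the sign for '{expected}'."
    return f"That looked like '{predicted}'. Try '{expected}' again."

def main():
    # Imported inside main() so feedback_message stays usable
    # without the framework installed.
    import streamlit as st

    st.title("Sign Ease - practice a sign")
    expected = st.selectbox("Sign to practice", ["hello", "thanks", "yes"])
    predicted = "hello"  # placeholder: a real app runs the gesture model here
    st.write(feedback_message(predicted, expected))

if __name__ == "__main__":
    main()
```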

Computer Vision: To implement computer vision functionality, we used OpenCV and MediaPipe libraries. OpenCV is an open-source computer vision and machine learning software library, while MediaPipe offers customizable building blocks to enable developers to create their own ML pipelines.
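A minimal sketch of that pipeline, assuming MediaPipe's `Hands` solution and a webcam at index 0 (the `landmarks_to_vector` helper is our own illustrative name, and the actual classifier call is elided):

```python
def landmarks_to_vector(landmarks):
    """Flatten (x, y) landmark pairs into one list, as classifier input."""
    return [coord for point in landmarks for coord in point]

def run_webcam_detection():
    # cv2 / mediapipe are imported here so the helper above has no
    # hard dependency on them.
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            vector = landmarks_to_vector([(p.x, p.y) for p in lm])
            # ... feed `vector` to the gesture classifier here ...
        cv2.imshow("Sign Ease", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```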

Testing and Deployment: Once development was complete, we tested the application thoroughly to ensure that it functioned correctly and met all requirements (though there are always a few elusive bugs left to fix). We then deployed the static quiz app to GitHub Pages, which redirects to the Streamlit app for WebRTC (Web Real-Time Communications) functionality.
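The WebRTC part can be sketched with `streamlit-webrtc`'s frame-callback API, as below. The `mirror` helper and the `key` value are illustrative; the real app would run the gesture model inside the callback instead of just flipping the image.

```python
import numpy as np

def mirror(img: np.ndarray) -> np.ndarray:
    """Flip a frame horizontally for a selfie-style view."""
    return np.ascontiguousarray(img[:, ::-1])

def main():
    # Imported lazily: streamlit-webrtc streams the browser camera
    # into Python over WebRTC.
    from streamlit_webrtc import webrtc_streamer
    import av

    def video_frame_callback(frame: av.VideoFrame) -> av.VideoFrame:
        img = frame.to_ndarray(format="bgr24")
        # ... run the gesture model on `img` here ...
        return av.VideoFrame.from_ndarray(mirror(img), format="bgr24")

    webrtc_streamer(key="sign-ease", video_frame_callback=video_frame_callback)
```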

Overall, building a project like ours required expertise in a range of technologies and tools, as well as strong project management skills to ensure that everything came together seamlessly. We didn't start with all of that expertise, but we learned from our mistakes and improved along the way.

🧩 Challenges we ran into

  • Training ML models on real-world sign language datasets
  • Configuring OpenCV and MediaPipe
  • Integrating with Streamlit for the first time
  • Coordinating team members across different time zones
  • Completely new tech stacks for some of us
  • Trying to integrate MongoDB with React; we couldn't finish in time (it was also our first attempt), but we hope to do so in the future

🏆 Accomplishments that we're proud of

  • Functional React app
  • Functional gesture detection model (trained by ourselves)
  • Use of the model and computer vision for predictions
  • Functional Streamlit app
  • Going from complete strangers to friends

💻 What we learned

  • React
  • Tailwind
  • npm CLI
  • Git and GitHub
  • Streamlit [https://streamlit.io/]
  • WebRTC
  • Computer vision libraries
  • Machine learning libraries

🚀 What's next for Sign Ease

1. Further refining the accuracy and reliability of the system, perhaps through additional training data or algorithm improvements.

2. Exploring potential commercial or research applications for the technology, such as improving accessibility for people who are deaf or hard of hearing, or using sign language recognition in virtual or augmented reality environments.

3. Collaborating with experts in the field, including sign language interpreters and members of the deaf and hard of hearing community, to ensure that the system is culturally sensitive and effective.

4. Continuously monitoring and testing the system's performance in real-world settings and making adjustments as needed based on feedback from users.

Overall, the possibilities for sign language detection technology are exciting and promising, and we look forward to finding sponsors so the project can keep growing in the future!

📈 Why Sign Ease?

Sign Ease is a project aimed at making it easier for people to learn sign language. There are many reasons why this is important.

Firstly, sign language is a vital means of communication for many individuals who are deaf or hard of hearing. Without access to sign language, these individuals may struggle to communicate effectively in daily life, leading to social isolation and exclusion.

Secondly, learning sign language can be challenging for those who do not have access to resources or qualified teachers. While there are many online resources available, they are often fragmented and difficult to navigate. This can make it challenging for people to learn sign language effectively.

Finally, sign language is an important part of global culture and heritage. By making it easier for people to learn sign language, we can help to preserve and promote this cultural treasure.

These are just a few of the many reasons why Sign Ease is such an important project. With our tool, we hope to make sign language more accessible and easier to learn for people around the world.

