Inspiration

As a team, we are excited to have developed a sign language detection app with the potential to make a significant impact on the lives of individuals who are deaf or mute. We believe the app can transform communication for people with hearing or speech impairments by making sign language more accessible to those who are not familiar with it.

What it does

One of the key benefits of the app is enhanced communication. We recognize that communication barriers can be a significant challenge for individuals who are deaf or mute, often leading to social isolation and exclusion. Our app can help facilitate communication between people who are deaf or mute and those who don't know sign language. With the app's ability to detect and translate sign language into spoken or written language, conversations can be more seamless and inclusive.

We also believe that the app can promote independence for individuals who are deaf or mute. By enabling them to communicate more effectively, the app can give them more autonomy and reduce their reliance on others. This can help them to participate more fully in conversations, navigate a variety of situations with greater ease, and enjoy a higher quality of life.

The app can also be valuable in educational settings, such as classrooms where deaf or mute students are present. Our team recognizes the challenges these students can face in communicating with their peers and teachers. With the app translating a student's signed contributions into spoken or written language, teachers and classmates can understand them more easily, and the student can participate more fully in the lesson. This helps create a more inclusive environment for all students and enhances the learning experience for those with hearing or speech impairments.

How we built it

We built the sign language detection app using TensorFlow, Keras, and native Android development in Kotlin and Java. Our team trained a machine learning model to recognize a range of sign language gestures and translate them into written or spoken language, using TensorFlow and Keras to train the model and optimize its accuracy. We then integrated the model into the native Android app: the app uses the device's camera to capture sign language gestures, and the model runs on the device itself so that detection and translation stay fast.
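As a rough illustration of the final translation step, the model produces a score per gesture class and the app maps the highest-scoring class to its written label. The gesture vocabulary, scores, and confidence threshold below are hypothetical stand-ins, not the app's actual labels or model output; this is a sketch of the idea, not the implementation:

```python
import math

# Hypothetical gesture vocabulary; the real app's label set differs.
GESTURE_LABELS = ["hello", "thank_you", "yes", "no", "please"]

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def translate(logits, threshold=0.5):
    """Return the written label for the most likely gesture,
    or None if the model is not confident enough to translate."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return GESTURE_LABELS[best] if probs[best] >= threshold else None

# Scores strongly favouring class 0 translate to "hello";
# a flat, uncertain score vector translates to nothing.
print(translate([4.0, 0.5, 0.2, 0.1, 0.3]))
print(translate([0.0, 0.0, 0.0, 0.0, 0.0]))
```

Thresholding like this lets the app stay silent on ambiguous frames instead of emitting a wrong word mid-conversation.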

Challenges we ran into

Integrating TensorFlow with a Kotlin app posed several challenges for our team. One of the main challenges was ensuring that the model could run smoothly on the device without impacting its performance. We had to optimize the model's size and efficiency to ensure that it could run efficiently on mobile devices. Additionally, we had to ensure that the model could seamlessly integrate with the Kotlin app, which required careful coordination between our machine learning and app development teams. We also faced challenges in debugging and troubleshooting issues related to the integration.
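One optimization of the kind described above is weight quantization: storing model weights as 8-bit integers instead of 32-bit floats, shrinking the model roughly 4x for on-device use. The pure-Python sketch below shows only the underlying idea; a real pipeline would rely on TensorFlow Lite's converter rather than hand-rolled code like this:

```python
# Sketch of symmetric int8 quantization, the idea behind shrinking
# a model for on-device inference. Illustrative only; a real app
# would use the TensorFlow Lite converter.

def quantize(weights):
    """Map float weights onto the int8 range [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    if scale == 0:
        return [0] * len(weights), 0.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.07, -0.21]
q, scale = quantize(weights)
approx = dequantize(q, scale)
# Each recovered weight lands within scale/2 of the original,
# at a quarter of the storage cost (8 bits vs 32 bits per weight).
```

The trade-off is a small, bounded rounding error per weight in exchange for a much smaller and faster model, which is what made smooth on-device inference feasible.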

Accomplishments that we're proud of

Our team is proud to have developed a sign language detection app that utilizes cutting-edge technology to break down communication barriers and promote inclusivity. We feel that our app has the potential to significantly impact the lives of individuals who are deaf or mute and make the world a more accessible and inclusive place.

What we learned

We learned about the challenges and opportunities of integrating machine learning with mobile app development, as well as the importance of collaboration and user-centered design in creating effective solutions.

What's next for SIGNx

The next steps for the sign language detection app involve further improvements and expansion of its capabilities. We plan to refine the machine learning model to improve its accuracy and expand its ability to recognize more sign language gestures. Additionally, we aim to add features such as a text-to-speech function, which will enable the app to not only recognize but also vocalize the translations. We also plan to explore partnerships with organizations that serve the deaf and mute community to gain feedback and insights on how to enhance the app's usability and impact. Ultimately, our goal is to continue to improve and evolve the app to better serve and empower individuals who are deaf or mute.

Built With

java, keras, kotlin, machine-learning, tensorflow
