Inspiration

Our group was inspired after one of our teammates shared his experience with constant ear issues. He also has a close friend who is almost completely deaf. Knowing how difficult communication can be for people who are hard of hearing, we wanted to create something useful for them. We decided to build a tool that makes communication easier between people who use sign language and people who don't.

What it does

SignFlow.AI is a web app that tracks hand movements through the camera and translates sign language into text on the screen. This helps people communicate with deaf people in public, where there isn't always something to write with.

How we built it

We built SignFlow.AI on a hand-tracking pipeline connected to a translation layer. We trained a model to recognize common signs and map them to words, then built a web interface where the translated text appears. We also added a chat box that shows live predictions of what the person is trying to say while they sign.
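The core recognition step can be sketched roughly as follows. This is a minimal illustration, not our production code: it assumes 21 hand landmarks per frame in the style that trackers like MediaPipe Hands produce, and uses a toy nearest-neighbour classifier in place of our trained model. All names here (`flatten_landmarks`, `NearestSignClassifier`) are hypothetical.

```python
import numpy as np

# Assumption: the hand tracker yields 21 (x, y, z) landmarks per frame,
# as MediaPipe Hands does. We flatten them into one 63-dim feature vector.
NUM_LANDMARKS = 21

def flatten_landmarks(landmarks):
    """Flatten a list of (x, y, z) landmark tuples into one feature vector,
    translated so the wrist (landmark 0) sits at the origin. This makes the
    features invariant to where the hand appears in the frame."""
    pts = np.asarray(landmarks, dtype=float)  # shape (21, 3)
    pts = pts - pts[0]                        # wrist-relative coordinates
    return pts.ravel()                        # shape (63,)

class NearestSignClassifier:
    """Toy 1-nearest-neighbour classifier mapping a pose vector to a letter.
    A real system would train a proper model on many labeled examples."""
    def __init__(self):
        self.vectors, self.labels = [], []

    def add_example(self, landmarks, label):
        self.vectors.append(flatten_landmarks(landmarks))
        self.labels.append(label)

    def predict(self, landmarks):
        query = flatten_landmarks(landmarks)
        dists = [np.linalg.norm(query - v) for v in self.vectors]
        return self.labels[int(np.argmin(dists))]
```

In use, each camera frame would be run through the tracker, flattened, and classified, with the resulting letter appended to the on-screen text.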

Challenges we ran into

One big challenge was getting the hand-tracking system to work accurately: the model frequently confused similar signs or produced no output at all. Another challenge was making the interface run smoothly on different devices. We also had trouble finding enough high-quality, correctly labeled sign language data to train the model properly.

Accomplishments that we're proud of

We are proud that we built a working prototype that can recognize signs for the entire alphabet and translate them into text. We are also proud that we created something meaningful that can help people with hearing difficulties. Even though it's not perfect yet, it shows real potential, and we're proud of what we accomplished.

What we learned

We learned how visual recognition models work and how difficult they can be to train. We also learned how to collaborate better as a team, especially when solving technical problems. Most importantly, we learned more about the challenges the deaf and hard-of-hearing community faces.

What's next for SignFlow.AI

Next, we want to improve live sign accuracy, likely by training the model on more data. We also want to support full words and sentences instead of just single letters. Another goal is to add voice-to-sign translation. In the future, we hope SignFlow.AI can become a fully accessible tool used in classrooms, workplaces, and everyday life.

Extra:

We also built a chat-with-PDF feature: users upload files to the application, and the AI learns from them and answers queries about their contents. This relates to accessibility because it helps users quickly find specific information buried in large documents.

Built With
