Inspiration
Communication today is still limited by language barriers, accessibility challenges, and the need to constantly switch between tools. We were inspired by how few practical options people with hearing disabilities have for everyday communication. Given the lack of assistive technology, we set out to change that. Beyond this, we designed our website not just for ASL users but for anyone who doesn't speak English.
What it does
Our project makes conversations between people of different backgrounds more convenient. Instead of relying on separate apps for translation or interpretation, we created a unified messaging platform that lets people express themselves naturally, regardless of language or ability.
How we built it
We built the platform using a React frontend, a Node/Express backend with Socket.IO for real-time chat, and a Python Flask service for sign detection. To recognize ASL letters, we used MediaPipe Hands to capture hand landmarks from live camera input and passed that data into a TensorFlow/Keras model trained on letter gestures. The Flask service processed those predictions and sent them back to the app, where the detected letters were added straight into the chat box.
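A rough sketch of the landmark preprocessing step described above. MediaPipe Hands reports 21 (x, y, z) landmarks per hand; before feeding them to a classifier, it is common to flatten them into a single feature vector. The wrist-relative translation and the function name here are our illustrative assumptions, not the exact pipeline:

```python
def landmarks_to_features(landmarks):
    """Flatten 21 (x, y, z) hand landmarks into a 63-value feature vector.

    Coordinates are shifted so the wrist (landmark 0) is the origin,
    which makes the features invariant to where the hand sits in frame.
    """
    wx, wy, wz = landmarks[0]  # MediaPipe landmark 0 is the wrist
    features = []
    for x, y, z in landmarks:
        features.extend([x - wx, y - wy, z - wz])
    return features

# Example with 21 dummy landmarks
dummy = [(0.5, 0.5, 0.0)] * 21
features = landmarks_to_features(dummy)
print(len(features))  # 63
```

A vector like this is what the TensorFlow/Keras letter classifier would consume on each frame.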
On top of that, we included speech-to-text, text-to-speech, and translation features so users can communicate in whatever way feels easiest for them. Putting all of this together let us build a single platform that makes signing, speaking, typing, and chatting work seamlessly in one place.
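To show how the Flask service described above could hand predictions back to the chat app, here is a minimal sketch of a prediction endpoint. The route name, JSON payload shape, and the stubbed-out classifier are all assumptions for illustration; the real service would call the trained Keras model instead of the placeholder:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

LETTERS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def predict_letter(features):
    # Placeholder for the trained TensorFlow/Keras model; it returns a
    # fixed letter so this sketch stays self-contained and runnable.
    return LETTERS[0]

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"landmarks": [[x, y, z], ...]} from the client
    landmarks = request.get_json()["landmarks"]
    features = [coord for point in landmarks for coord in point]
    return jsonify({"letter": predict_letter(features)})

if __name__ == "__main__":
    app.run(port=5000)
```

The frontend would POST each frame's landmarks here and append the returned letter to the chat box.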
Challenges we ran into
Our biggest challenge was finding strong datasets and training the AI model to understand sign language. Sign language is highly expressive and nuanced, which made the task that much harder. On top of that, we had to ensure our platform remained accessible and easy to use.
Accomplishments that we're proud of
Our model detects sign language relatively quickly despite the limited time we had to train it. Bridging the gap between spoken and sign language was a major challenge, and we're proud we were able to overcome it.
What we learned
Thanks to the workshop, we learned how to navigate Flask. We also picked up some sign language while training our AI, and, most importantly, we learned how to communicate with our teammates effectively to get work done. Through this project, we learned that communication is more than just exchanging words; it's about understanding meaning, intent, and accessibility.
What's next for Bell Messenger
Due to time constraints, we were limited to ASL letters, but with a larger dataset and more training time we could train models that detect whole words and translate faster. We also plan to improve our context-aware translation system by incorporating more advanced models, allowing conversations to feel even more natural and accurate across different languages.