Inspiration
The hearing-impaired population relies on lip reading, gestures, or text conversation to communicate.
Due to the COVID-19 outbreak, working from home has become the new normal: meetings and online classes are conducted via Zoom, and social gatherings via Gather Town. The nature of online communication may disadvantage hearing-impaired people in education and employment: not everyone turns on their camera during meetings, speech-to-text transcription is unavailable in most conference-call software, and camera quality can hinder lip reading.
In light of this, we would like to build a web plugin that offers real-time speech-to-text, so that mute and hearing-impaired users can follow group discussions, meetings, and interviews, and real-time hand-gesture-to-audio conversion, so that they can express themselves.
What it does
Speech to text
- real-time speech-to-text conversion
Sign language translation
- translates sign language to text
How we built it
UI design using Figma
Front-end development using HTML, CSS and JavaScript
Back-end development using Flask and Python
Hand detection using MediaPipe
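The back end above can be sketched as a minimal Flask app; the route name and payload shape here are assumptions for illustration, not the project's actual API.

```python
# Minimal sketch of a Flask back end that serves transcription
# results to the front end. The /transcript route and the JSON
# shape are hypothetical, not the project's real endpoints.
from flask import Flask, jsonify

app = Flask(__name__)

# Latest transcript text, to be updated by a speech-to-text worker.
latest_transcript = {"text": ""}

@app.route("/transcript")
def transcript():
    # The front-end JavaScript could poll this endpoint and render the text.
    return jsonify(latest_transcript)

if __name__ == "__main__":
    app.run(debug=True)
```

The HTML/CSS/JavaScript front end would then fetch this endpoint periodically and display the text as subtitles.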
Challenges we ran into
Over half of our team had no experience in back-end development, so we spent a lot of time learning and experimenting.
Our initial plan was a bit ambitious: it included translating sign language to audio. We followed the MediaPipe guide to implement hand detection, but we encountered several bugs along the way (such as in the overlay and canvas drawing) and were unable to complete this feature.
Finding the right dataset for our module was another big challenge. We surveyed a large number of datasets, but it was hard to find one that matched our needs.
Accomplishments that we're proud of
We built a prototype! ;p
We implemented speech-to-text transcription into the user's preferred language using Python; it still needs to be integrated with our front end.
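The real-time loop can be sketched engine-agnostically. The writeup does not name the recognition engine, so `recognize` below is a hypothetical stand-in for whatever speech-to-text call the back end wraps; the chunking structure is the illustrative part.

```python
def transcribe_stream(audio_chunks, recognize):
    """Yield a text snippet for each audio chunk as it arrives.

    `audio_chunks` is any iterable of raw audio segments (e.g. from
    a microphone stream); `recognize` is a hypothetical callable
    wrapping the actual speech-to-text engine.
    """
    for chunk in audio_chunks:
        try:
            text = recognize(chunk)
        except Exception:
            # Skip chunks the engine cannot decode (silence, noise).
            continue
        if text:
            yield text
```

Feeding each yielded snippet to the front end keeps the transcript appearing incrementally rather than only after the speaker finishes.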
We also implemented a hand-gesture detection module that tracks hand movement. Since our target audience communicates through hand gestures, this module is the foundation of the translation feature. We are currently looking for a dataset of sign language vocabulary so that we can complete sign-language-to-text transcription.
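As a taste of what classification on top of the detected landmarks could look like, here is a simple heuristic over MediaPipe-style hand landmarks (21 points per hand in normalized image coordinates, y growing downward). The heuristic and landmark index choices are an assumption for illustration; real sign-language vocabulary would need a trained model on a proper dataset.

```python
# Tip and PIP-joint landmark indices for the four non-thumb fingers,
# following the MediaPipe Hands landmark numbering.
FINGER_TIPS = (8, 12, 16, 20)
FINGER_PIPS = (6, 10, 14, 18)

def count_extended_fingers(landmarks):
    """Count non-thumb fingers whose tip sits above its PIP joint.

    `landmarks` is a sequence of 21 (x, y) pairs in normalized image
    coordinates; a smaller y means higher in the frame. The thumb is
    skipped because its extension depends on handedness.
    """
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )
```

An open palm yields 4 and a closed fist yields 0, which is enough to distinguish a couple of coarse gestures while a full sign-language model is still out of reach.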
What we learned
We explored many interesting packages in both JavaScript and Python, learned the basics of JavaScript, and learned how to connect the app to the camera.
What's next for Converse
A fully functional app
A plugin for meeting software such as Zoom and Teams
Improved accuracy and translation speed
Built With
- css
- figma
- html
- javascript
- mediapipe
- python