If we take a moment to stop and think of those who cannot speak or hear, we realize how much we take for granted. To improve the lives of these differently abled people, we needed to come up with a solution, and here we present Proximity.

Proximity uses the Myo armband for sign recognition and live speech recognition for voice input. The armband feeds a trained ML model that reads the signs made by human hands and interprets them, helping the speech impaired share their ideas and communicate with people and digital assistants alike. The service also serves the hearing impaired, so that they can know when somebody is calling their name or giving them a task.
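As a rough illustration of the gesture-recognition step, the sketch below classifies an 8-channel Myo EMG reading by nearest centroid. The gesture names, feature values, and the nearest-centroid approach are illustrative assumptions, not the actual Azure ML model described above.

```python
import math

# Toy "trained model": one mean EMG feature vector per gesture.
# The Myo armband exposes 8 EMG channels; these values are made up.
GESTURE_CENTROIDS = {
    "hello":     [0.1, 0.8, 0.2, 0.7, 0.1, 0.6, 0.2, 0.9],
    "thank_you": [0.9, 0.1, 0.8, 0.2, 0.7, 0.1, 0.6, 0.2],
}

def classify_gesture(emg_sample):
    """Return the gesture whose centroid is nearest to the EMG sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_CENTROIDS, key=lambda g: dist(emg_sample, GESTURE_CENTROIDS[g]))
```

In the real pipeline this classification happens in a trained model rather than a hand-written distance check, but the input/output shape is the same: an EMG feature vector in, a gesture label out.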

We're proud of successfully recognizing a few gestures and setting up a web app that understands and learns a person's name. Apart from that, we built a to-do list that lets hearing-impaired users actively note down tasks assigned to them.
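A minimal sketch of the name-plus-to-do flow might look like the following: scan each speech-recognition transcript for the user's name, and when it appears, record the text that follows as a task. The function name and the comma-splitting heuristic are assumptions for illustration, not our app's actual logic.

```python
def extract_task(transcript, user_name):
    """If the transcript addresses user_name, return the task text after it."""
    lowered = transcript.lower()
    name = user_name.lower()
    if name in lowered:
        # Treat everything after the name (and an optional comma) as the task.
        task = lowered.split(name, 1)[1].lstrip(" ,")
        return task or None
    return None

# Build the to-do list from a stream of recognized utterances.
todo = []
for line in ["Alex, please submit the report", "unrelated chatter"]:
    task = extract_task(line, "Alex")
    if task:
        todo.append(task)
```

The real app would additionally notify the user visually when their name is detected, rather than only appending to a list.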

We learned an entirely new language, Lua, to set up and use the Myo armband SDK. Apart from that, we used a wide array of languages, scripts, APIs, and products across the project, including Python, C++, Lua, JavaScript, Node.js, HTML, CSS, Azure Machine Learning Studio, and Google Firebase.

We look forward to exploring the unlimited opportunities ahead for Proximity: training it to recognize the entire American Sign Language vocabulary using the powerful computing capabilities of Azure Machine Learning Studio, and advancing our speech recognition app to understand more complex conversations. Proximity should integrate seamlessly into the lives of the differently abled.
