Inspiration

Watching a deaf classmate struggle to communicate in Stats class inspired us to create a tool that helps deaf people communicate more effectively with hearing people, and vice versa.

What it does

Using the Myo armband, a deaf person can sign and transmit his/her gestures to an Android app that voices and transcribes them. Hearing users can communicate back as well: their speech is converted to text that the deaf user can read.

How I built it

We trained a gesture classifier on data from the Myo armband. Using the KNN (k-nearest neighbors) algorithm, our code classifies each gesture and maps it to the correct message. The Azure Speech Cognitive Service then converts that message into audible speech. A hearing person can in turn speak and have his/her message transcribed for the deaf user to read.
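
As a rough illustration of the classification step, here is a minimal sketch using scikit-learn. The feature vectors and gesture labels are hypothetical stand-ins: in practice each row would be features computed from a window of the Myo's eight EMG channels.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training data: each row is a feature vector derived from a
# window of the Myo's 8 EMG channels (e.g., mean absolute value per channel).
X_train = np.array([
    [0.12, 0.40, 0.33, 0.05, 0.22, 0.18, 0.09, 0.31],  # "hello"
    [0.45, 0.10, 0.08, 0.51, 0.30, 0.44, 0.12, 0.07],  # "thank you"
    [0.05, 0.06, 0.55, 0.48, 0.11, 0.09, 0.40, 0.42],  # "yes"
])
y_train = ["hello", "thank you", "yes"]

# KNN: a new gesture gets the label of the majority of its k nearest
# neighbors in feature space (k=1 here for the toy example).
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X_train, y_train)

# Classify a new window of EMG features captured from the armband.
new_window = np.array([[0.11, 0.38, 0.35, 0.06, 0.20, 0.19, 0.10, 0.30]])
print(knn.predict(new_window))  # -> ['hello']
```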
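The text-to-speech side works roughly like the sketch below, mirroring the REST calls we prototyped in Postman: exchange a subscription key for a short-lived bearer token, then POST SSML to the Azure Speech synthesis endpoint. The region, key, and voice name are placeholders, not our actual configuration.

```python
import requests

REGION = "eastus"                      # placeholder Azure region
SUBSCRIPTION_KEY = "<your-key-here>"   # placeholder Speech resource key

# Step 1: exchange the subscription key for a short-lived bearer token.
token = requests.post(
    f"https://{REGION}.api.cognitive.microsoft.com/sts/v1.0/issueToken",
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
).text

# Step 2: send SSML containing the message for the recognized gesture.
ssml = (
    "<speak version='1.0' xml:lang='en-US'>"
    "<voice name='en-US-JennyNeural'>hello</voice>"
    "</speak>"
)
audio = requests.post(
    f"https://{REGION}.tts.speech.microsoft.com/cognitiveservices/v1",
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/ssml+xml",
        "X-Microsoft-OutputFormat": "riff-16khz-16bit-mono-pcm",
    },
    data=ssml.encode("utf-8"),
)

# Save the returned WAV audio so the app can play it aloud.
with open("message.wav", "wb") as f:
    f.write(audio.content)
```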

Challenges I ran into

We tried to use the Speech-to-Text service in Azure, but our student authentication was denied. With little time left, we pivoted and focused on submitting on time.

Accomplishments that I'm proud of

Getting KNN gesture classification working. Integrating Azure Cognitive Services.

What I learned

Sampling. KNN. Machine learning. The Azure Speech Cognitive Service. Postman API calls. Android Studio.

What's next for TalkToTheHand

We plan to filter the raw data from the Myo armband to increase our accuracy, and to expand our dictionary of gestures to include facial and head movements.
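
One candidate for that filtering step, sketched below under the assumption that we smooth each EMG channel with a simple moving average before feature extraction (the window size is a tuning placeholder):

```python
import numpy as np

def smooth_emg(raw, window=5):
    """Moving-average filter applied per channel to reduce EMG noise.

    raw: array of shape (samples, 8) from the Myo's eight EMG sensors.
    window: number of samples to average (a placeholder we would tune).
    """
    kernel = np.ones(window) / window
    # Convolve each channel independently; 'same' keeps the original length.
    return np.column_stack(
        [np.convolve(raw[:, ch], kernel, mode="same") for ch in range(raw.shape[1])]
    )

# Example: smooth a burst of noisy readings before classification.
noisy = np.random.randint(-128, 128, size=(200, 8)).astype(float)
clean = smooth_emg(noisy)
```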
