Screenshots: the hardware portion of the glove; wearing the glove while signing; the app outputting the translated sign language; the app recording data from the sensors; the app calibrating the sensors.
Inspiration
After working at the blind school in New Delhi, I was amazed by the ease with which blind and deaf boys and girls could communicate in sign language. I realized that while sign language works for people who know the signs, communication is nearly impossible with those who are unfamiliar with the symbols. I set out to give the blind and deaf a voice and to bridge the communication gap between all people.
What it does
Sign Waves uses a glove equipped with flex sensors and a 9-degrees-of-freedom orientation sensor to recognize gestures and various sign language symbols. Compared to other designs, the glove is compact and wireless; it uses Bluetooth 4.0 to communicate with the Android application I developed. After calibrating the flex and orientation sensors to get baseline readings, the user can train the app to learn additional gestures. Then, when the user would like to communicate with others, the app acts as a translator: the user signs, the app uses machine learning to translate the gesture into English, and Text-to-Speech narrates the user's words.
How I built it
The glove uses an Intel Edison along with an orientation sensor, hooked up to a battery so that the entire clothing piece is wireless. As a user performs a gesture, the sensors record this data and advertise it to the phone using Bluetooth LE. The app then parses this data and uses a moving average to smooth over irregularities and jumps. We establish a baseline by having the user open and close their fist.
Subsequent voltage values are normalized within this min and max to the range 0 to 1. This is necessary for the next step, where we use the data to train a Support Vector Machine (SVM) classifier. The classifier is then used to translate gestures into letters.
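The smoothing and normalization steps can be sketched in Python as follows (a minimal illustrative sketch: the function names and the window size are assumptions, and the real pipeline runs inside the Android app):

```python
from collections import deque

def moving_average(readings, window=5):
    """Smooth a stream of raw sensor voltages with a simple moving
    average. The window size of 5 is illustrative."""
    buf = deque(maxlen=window)
    smoothed = []
    for value in readings:
        buf.append(value)
        smoothed.append(sum(buf) / len(buf))
    return smoothed

def normalize(value, v_min, v_max):
    """Map a smoothed voltage into [0, 1] using the min and max
    captured during the open/closed-fist calibration, clamping
    out-of-range readings."""
    if v_max == v_min:
        return 0.0
    return min(1.0, max(0.0, (value - v_min) / (v_max - v_min)))
```

Each normalized feature vector (orientation plus flex values) is what gets fed to the SVM classifier for training and prediction.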
Challenges I ran into
Using the flex sensors and the Intel Edison with Bluetooth was a challenge due to their unreliability. However, by engineering robust software, the code can now deal with a number of corner cases. Additionally, each data point had 13 features (9 from the orientation sensor and 4 from the flex sensors).
Accomplishments that I'm proud of
I was able to build a usable device for people who speak sign language. We are hopeful this will aid and impact many lives when I next take it to India.
What's next for Sign Waves
This weekend we only dealt with the letters of the alphabet, but we look forward to expanding the library of gestures the app can recognize.