Inspiration

What it does

When a gesture is performed, raw EMG and accelerometer data are collected with a Myo armband, passed through digital filtering, and fed to an Azure-based supervised machine learning algorithm. After training on a large set of sample data, the algorithm can predict the identity of a gesture from its EMG and accelerometer signature, which allows custom Myo gestures to be created.
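The post doesn't say which filter was used, so here is a minimal sketch of the filtering-and-feature step, assuming the Myo's 8-channel, 200 Hz EMG stream; the band-pass cutoffs and the feature choices (mean absolute value and RMS per channel) are illustrative assumptions, not the project's actual pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 200.0  # the Myo armband streams 8 EMG channels at 200 Hz

def bandpass_filter(emg, low=20.0, high=90.0, order=4):
    """Band-pass raw EMG to suppress motion artifacts and high-frequency
    noise. Cutoff frequencies here are assumed values, not the project's."""
    nyq = FS / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, emg, axis=0)  # zero-phase filtering per channel

def extract_features(emg_window):
    """Summarize a filtered window of shape (samples, 8 channels) into
    simple per-channel features suitable for a supervised classifier."""
    mav = np.mean(np.abs(emg_window), axis=0)        # mean absolute value
    rms = np.sqrt(np.mean(emg_window ** 2, axis=0))  # root mean square
    return np.concatenate([mav, rms])                # 16-dimensional vector
```

Accelerometer channels could be windowed and summarized the same way and appended to the feature vector before it is sent to the classifier.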

We trained the system by performing American Sign Language gestures hundreds of times while wearing the armband. After this training stage, it recognized gestures with 80% accuracy.
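An Azure ML Studio experiment published as a web service can be scored over REST. The sketch below shows that generic request-response pattern with a placeholder endpoint, key, and input schema; the actual column names and output layout depend on the trained experiment, so treat every identifier here as hypothetical:

```python
import json
import urllib.request

# Placeholder values: substitute your own Azure ML web service endpoint and key.
SCORING_URL = ("https://ussouthcentral.services.azureml.net/workspaces/<workspace-id>"
               "/services/<service-id>/execute?api-version=2.0")
API_KEY = "<your-api-key>"

def classify_gesture(feature_vector):
    """Send one feature vector to the Azure ML web service and return
    the predicted gesture label from the response table."""
    body = json.dumps({
        "Inputs": {
            "input1": {
                # Hypothetical column names; these must match the experiment's schema.
                "ColumnNames": [f"f{i}" for i in range(len(feature_vector))],
                "Values": [[str(v) for v in feature_vector]],
            }
        },
        "GlobalParameters": {},
    }).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + API_KEY,
    }
    req = urllib.request.Request(SCORING_URL, body, headers)
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())
    # Assumes the predicted label is the last column of the output table.
    return result["Results"]["output1"]["value"]["Values"][0][-1]
```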

How I built it

Challenges I ran into

Accomplishments that I'm proud of

What I learned

What's next for Wearable Machine Learning Sign Language Translator
