Inspiration

I first thought of using a Myo for American Sign Language recognition. I made a hard-coded version of the gesture detection and won my first Myo at that hackathon, but it wasn't as robust an application as I had hoped. It didn't pick up gestures and movements, or assign them meanings, as elegantly as I anticipated. LearnMotion is the natural successor to my first project.

What it does

LearnMotion allows users to save the data representation of a specific motion to their account, and enables them to access it on web and mobile platforms. Users can then compare the original "perfect" recording with their real-time practice data. For example, a person with a communication disability can use this application to effectively learn and practice American Sign Language gestures by downloading them from the motionStore, and then have the application pick up the motions when they perform them live to send a message: a friendly wave would be "hello" on a messenger application.
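As a rough illustration, a saved recording could look something like the following. This is a minimal TypeScript sketch assuming each motion is stored as a series of accelerometer samples; the field names here are hypothetical, not LearnMotion's actual schema.

```typescript
// Hypothetical shape of a saved "perfect" recording; field names are
// illustrative assumptions, not the actual LearnMotion schema.
interface MotionRecording {
  name: string;                        // e.g. "hello-wave"
  device: "myo" | "pebble";            // which wearable produced the data
  sampleRateHz: number;                // how often samples were captured
  samples: [number, number, number][]; // accelerometer x/y/z over time
}
```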

How I built it

The application records motion data generated from the Myo or Pebble and saves it to a Firebase database, where the user can view all of these perfect recordings in the application. They can choose a perfect recording, perform the motion live, and the application will give them live feedback based on how similar the two data representations are. We compared the two data vectors using the Dynamic Time Warping (DTW) algorithm. The application was built for Android, Web, and iOS, using Node.js, Express, Myo, Pebble, Firebase, and DTW.
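To make the comparison step concrete, here is a minimal DTW sketch in TypeScript. It assumes recordings are arrays of 3-axis accelerometer samples and uses the classic O(n·m) dynamic-programming formulation; the function names and sample shape are assumptions for illustration, not LearnMotion's actual implementation.

```typescript
type Sample = [number, number, number]; // x, y, z acceleration

// Euclidean distance between two accelerometer samples.
function dist(a: Sample, b: Sample): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// DTW distance between a "perfect" recording and a live attempt:
// lower means the two motions are more similar, even if their timing differs.
function dtw(reference: Sample[], attempt: Sample[]): number {
  const n = reference.length;
  const m = attempt.length;
  if (n === 0 || m === 0) return Infinity; // nothing to compare

  // cost[i][j] = minimal accumulated cost of aligning the first i reference
  // samples with the first j attempt samples.
  const cost: number[][] = Array.from({ length: n + 1 }, () =>
    new Array<number>(m + 1).fill(Infinity)
  );
  cost[0][0] = 0;

  for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= m; j++) {
      const d = dist(reference[i - 1], attempt[j - 1]);
      cost[i][j] = d + Math.min(
        cost[i - 1][j],     // attempt sample repeated
        cost[i][j - 1],     // reference sample repeated
        cost[i - 1][j - 1]  // samples matched one-to-one
      );
    }
  }
  return cost[n][m];
}
```

A lower score means the attempt tracks the reference more closely regardless of small timing differences, so live feedback can be as simple as checking the score against a per-gesture threshold.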

Challenges I ran into

The biggest challenge was the idea itself: would it work at all? Could recordings from different devices be compared with enough efficiency and speed? Beyond that, the web version was relatively easy to build, test, and experiment with. The difficult part was integrating Firebase, Myo, and Pebble on the iOS and Android platforms; our iOS app is incomplete because it was so difficult to make everything work together. It is also very hard to make the experience seamless, where the user can trigger comparison after comparison without friction.

Accomplishments that I'm proud of

Coming up with this idea, building a working web version, getting a decent version on Android, and making it really far on iOS makes me very proud and happy.

What I learned

Integrating Myo and Firebase on Android and iOS is harder than I thought.

What's next for LearnMotion

Lots of cool ideas: letting everybody share their motion recordings and making them compatible across mobile and wearable devices, more ML for even more accurate and efficient analyses, and a more robust app overall. I will be working on these with my team after this hackathon.
