Inspiration

After seeing a MyoBand being used to classify hand gestures and act on that data, we decided to try to classify even more gestures using machine learning and neural networks.

What it does

The MyoPiano app lets a user connect a MyoBand and then play an onscreen piano entirely through the gestures the band recognizes.
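
As an illustration of the idea (the gesture names, note mapping, and sample files below are assumptions for the sketch, not the project's actual code), the Swift side of wiring recognized gestures to piano notes might look roughly like this:

```swift
import AVFoundation

// Hypothetical gesture labels; the real set depends on the trained model.
enum Gesture: String {
    case fist, fingersSpread, waveIn, waveOut, doubleTap
}

final class PianoController {
    // One cached audio player per note; assumes bundled samples like "C4.wav".
    private var players: [String: AVAudioPlayer] = [:]
    private let noteForGesture: [Gesture: String] = [
        .fist: "C4", .fingersSpread: "D4",
        .waveIn: "E4", .waveOut: "F4", .doubleTap: "G4",
    ]

    // Called whenever the classifier emits a new gesture label.
    func handle(_ gesture: Gesture) {
        guard let note = noteForGesture[gesture],
              let url = Bundle.main.url(forResource: note, withExtension: "wav")
        else { return }
        let player = players[note] ?? (try? AVAudioPlayer(contentsOf: url))
        players[note] = player
        player?.currentTime = 0   // restart the sample if it is already playing
        player?.play()
    }
}
```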

How we built it

We built the iOS app in Xcode, largely in Swift. We trained our model with TensorFlow and then imported it into the app. Because Swift can't call the TensorFlow C++ API directly, however, we had to bridge our TensorFlow model using Objective-C++.
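
A common shape for that bridge is an Objective-C++ wrapper class exposed to Swift through a bridging header. Here is a minimal sketch of the Swift side, assuming a hypothetical `GestureClassifier` wrapper (its name, interface, and the window size are illustrative, not the project's actual API):

```swift
// Assumed Objective-C++ wrapper, exposed via the bridging header:
//
// @interface GestureClassifier : NSObject
// - (instancetype)initWithModelPath:(NSString *)path;
// - (NSInteger)predictWithSamples:(NSArray<NSNumber *> *)samples;
// @end

let modelPath = Bundle.main.path(forResource: "gesture_model", ofType: "pb")!
let classifier = GestureClassifier(modelPath: modelPath)

// The Myo streams 8-channel Int8 EMG readings; buffer a fixed window,
// then hand the whole window to the model.
var window: [NSNumber] = []

func onEMGSample(_ channels: [Int8]) {
    window.append(contentsOf: channels.map { NSNumber(value: $0) })
    if window.count >= 8 * 50 {               // e.g. 50 readings x 8 channels
        let label = classifier.predict(withSamples: window)
        window.removeAll()
        print("predicted gesture index: \(label)")
    }
}
```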

Challenges we ran into

Given the 24-hour time frame, we weren't able to collect enough data to provide a reliable experience to the user, and gesture recognition is very finicky. This was also our first experience with iOS development, and the learning curve was steep.
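
One general technique for taming finicky per-frame predictions (sketched here as an idea, not something we describe implementing) is to smooth over a short window with a majority vote; the window size and vote threshold below are illustrative:

```swift
// Majority-vote smoothing over the last N predictions: only report a
// gesture once it dominates the recent window.
struct GestureSmoother {
    private var recent: [Int] = []
    private let windowSize = 10
    private let minVotes = 7

    // Returns a gesture index only when it wins enough recent votes.
    mutating func add(_ label: Int) -> Int? {
        recent.append(label)
        if recent.count > windowSize { recent.removeFirst() }
        let counts = Dictionary(grouping: recent, by: { $0 }).mapValues(\.count)
        if let (winner, votes) = counts.max(by: { $0.value < $1.value }),
           votes >= minVotes {
            return winner
        }
        return nil
    }
}
```

Feeding each raw prediction through `add(_:)` and acting only on non-nil results makes the output far less twitchy, at the cost of a small delay.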

Accomplishments that we're proud of

We are proud of building a fully functional app in which the Myo can control and play the piano, even if the model isn't entirely accurate.

What we learned

We learned a lot about iOS development, Swift, and bridging from Objective-C to Swift. We also learned about training models and collecting data for a neural network.

What's next for MyoPiano

Built With

Swift, Xcode, TensorFlow, Objective-C++
