Almost everyone has seen Jedi use the Force in Star Wars or wizards use magic in Harry Potter to control all sorts of objects. Our team wondered if we could implement a similar ability in real life, allowing for hands-free control of all kinds of smart devices.

What it does

Our project, MYOwn, lets users create and save their own gestures to control their IoT devices. Users wear a Myo armband from Thalmic Labs, sync it with their phone and computer, and can then customize the behavior of devices such as lightbulbs, security systems, and media players based on the motion of their hand and their current location.

How we built it

Our team combined a number of platforms and tools to make MYOwn a reality. We first needed a way to save and detect custom gestures. Using support vector machines (SVMs), the user can train a classification algorithm by holding a pose for approximately 5 seconds; the trained SVM then reliably recognizes that pose from the 8 EMG sensors in the Myo armband. Syncing and other control processes are handled by a complementary Android app, which also relays GPS data to the server so that the response can depend on the user's location. We use Amazon Web Services for our server, which connects to the various IoT devices being controlled.
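The training flow above can be sketched roughly as follows. This is a minimal illustration using scikit-learn, not our actual code: the feature choice (mean absolute value per channel), window size, and the simulated EMG numbers are all assumptions made for the example.

```python
# Sketch of an SVM gesture classifier over windowed 8-channel EMG,
# in the spirit of the Myo's 8 EMG sensors. All numbers are simulated.
import numpy as np
from sklearn.svm import SVC

def mav_features(window):
    """Mean absolute value per channel: one 8-dim feature vector per window."""
    return np.mean(np.abs(window), axis=0)

rng = np.random.default_rng(0)

# Simulated training data: ~5 s of a held pose vs. a rest pose,
# chopped into 50-sample windows of 8 channels each.
pose_windows = rng.normal(40, 10, size=(20, 50, 8))  # strong muscle activation
rest_windows = rng.normal(2, 1, size=(20, 50, 8))    # near-silent EMG

X = np.array([mav_features(w) for w in np.concatenate([pose_windows, rest_windows])])
y = np.array([1] * 20 + [0] * 20)  # 1 = custom gesture, 0 = rest

clf = SVC(kernel="rbf").fit(X, y)

# Classify a fresh window of simulated "pose" EMG.
new_window = rng.normal(40, 10, size=(50, 8))
print(clf.predict([mav_features(new_window)]))  # expect [1]
```

In practice the features would be computed over a sliding window of the live EMG stream, so the armband's output is classified continuously rather than once.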

Challenges we ran into

One major hurdle we overcame was analyzing the EMG data from the Myo. Since raw EMG data is mostly a series of sharp spikes around zero, identifying appropriate preprocessing and classification algorithms for our case was a challenge. Ultimately, we tested several different approaches before settling on SVMs.

Another challenge was working with the limited Myo developer tools. Thalmic Labs exposes only a restricted set of data from the Myo by default, but we needed the full EMG stream to support custom gestures, which forced us to come up with some creative workarounds.

Accomplishments that we're proud of

Our team is extremely proud of our machine learning setup. Even though our members are all high schoolers with very limited machine learning experience, we were able to research approaches online and implement them effectively.

What we learned

We learned that hard work and dedication can make any project a success.

What's next for MYOwn Custom Gestures

Currently, processing-power limitations mean we can only store one custom gesture at a time. As we grow our training data set and improve the SVMs' ability to handle more classes, we hope to expand MYOwn's functionality. Additionally, we will look into incorporating IFTTT so that this tool can connect with many more IoT devices.
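The IFTTT integration could be as simple as having our server fire a Maker Webhooks trigger when a gesture is recognized. The sketch below uses IFTTT's documented trigger URL format; the event name, key, and gesture label are placeholders, and `trigger_event` is a hypothetical helper, not part of our current code.

```python
# Sketch: forward a recognized gesture to IFTTT via Maker Webhooks.
import json
import urllib.request

def ifttt_url(event, key):
    """Build IFTTT's Maker Webhooks trigger URL for a named event."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"

def trigger_event(event, key, value=None):
    """POST the trigger, optionally attaching the gesture label as value1."""
    body = json.dumps({"value1": value}).encode()
    req = urllib.request.Request(
        ifttt_url(event, key),
        data=body,
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)  # returns the HTTP response

# Example call (requires a real IFTTT key, so not executed here):
# trigger_event("gesture_detected", "MY_KEY", "fist")
print(ifttt_url("gesture_detected", "MY_KEY"))
```

An IFTTT applet listening for the `gesture_detected` event could then switch on any of the hundreds of services IFTTT supports, without us writing per-device integrations.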
