We wanted to simplify touch-free control of a mobile phone, so we built a prototype glove that recognizes hand gestures and relays them to your device.

How it works

Our prototype connects to the cloud through the Electric Imp dev board and uses an accelerometer to determine the orientation of the hand. A flex sensor running along the middle finger distinguishes between an open hand and a fist, while conductive-thread sensors in the fingertips let the user interact with their device by tapping their fingers.
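The classification logic described above can be sketched roughly as follows. This is an illustrative reconstruction, not the project's actual code: the function name, thresholds, and normalized sensor ranges are all assumptions.

```python
# Hypothetical sketch of the glove's gesture classification.
# Sensor readings are assumed to be pre-normalized; thresholds are illustrative.

def classify_gesture(accel_z, flex, tapped_finger=None):
    """Map raw sensor readings to a gesture label.

    accel_z:       gravity component along the palm normal (-1..1),
                   used to infer hand orientation (palm up vs. palm down).
    flex:          flex-sensor reading along the middle finger
                   (0 = straight, 1 = fully bent), distinguishing
                   an open hand from a fist.
    tapped_finger: index of a fingertip whose conductive-thread pad
                   closed a circuit, or None if no tap occurred.
    """
    # Fingertip taps take priority over hand posture.
    if tapped_finger is not None:
        return f"tap:{tapped_finger}"
    hand = "fist" if flex > 0.6 else "open"
    orientation = "palm_down" if accel_z < -0.5 else "palm_up"
    return f"{hand}:{orientation}"
```

The resulting labels (e.g. `open:palm_down`, `tap:2`) would then be sent upstream for mapping to phone actions.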

Challenges I ran into

Bluetooth integration with iOS proved difficult, so we went with a cloud-proxy implementation instead. This ended up improving the system's reliability and power efficiency, since the gesture recognition algorithm runs in the cloud rather than on the wearable.
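A minimal sketch of the cloud-side half of that proxy, under stated assumptions: the wearable streams raw sensor samples as JSON, and the cloud classifies each sample and returns the gesture to push to the phone. The payload fields, thresholds, and handler name are placeholders, not the project's actual API.

```python
import json

def handle_sample(request_body):
    """Hypothetical cloud-side handler: parse one sensor sample,
    classify it, and return the gesture the phone should receive."""
    sample = json.loads(request_body)
    # Offloading recognition keeps the wearable's CPU load and radio
    # duty cycle low, which is where the power savings come from.
    if sample.get("tap") is not None:
        gesture = "tap"
    elif sample["flex"] > 0.6:
        gesture = "fist"
    else:
        gesture = "open"
    return json.dumps({"gesture": gesture})
```

The wearable then only needs to sample sensors and POST small payloads, while all recognition logic can be updated server-side without reflashing the glove.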

Accomplishments that I'm proud of

Changing the music with a flick of your hand, pausing with a knock gesture... interacting with the completed device felt intuitive and smooth.
