Inspiration

The essence of Pebble is to immerse the user and keep them more connected to the world than ever before. Our objective was to create an application that lets users experience the full extent of that goal.

How it works

There are two parts to this product, so we will explain how each part works and how the two communicate. The first part is an Android application that communicates freely with the Pebble application we created. The Android application receives signals from the Pebble application indicating that a specific arm gesture was made. Based on options set up by the user, the application then performs an action on the device, such as turning on the flashlight or starting and stopping music. The Pebble application, in turn, contains the gesture recognizer. Using the raw data provided by the device's accelerometer, it can learn custom gestures that the user defines and recognize them in the future. In this way, the gestures the user sets can trigger actions on the Android device.
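The recognize-and-trigger loop described above can be sketched roughly as follows. This is an illustrative Python sketch, not the project's actual Pebble C code; names like `similarity` and `recognize`, the magnitude-based comparison, and the threshold value are all our assumptions.

```python
# Illustrative sketch (not the actual implementation): a recorded
# accelerometer template is compared against the latest window of live
# samples, and a close enough match would trigger an Android action.
import math

def magnitude(sample):
    """Collapse an (x, y, z) accelerometer sample into one magnitude."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def similarity(template, window):
    """Mean squared difference between two equal-length traces (lower = closer)."""
    diffs = [(magnitude(a) - magnitude(b)) ** 2 for a, b in zip(template, window)]
    return sum(diffs) / len(diffs)

def recognize(template, window, threshold=0.5):
    """True when the live window is close enough to the learned template."""
    return similarity(template, window) < threshold

# Hypothetical usage: a learned "flick" gesture vs. a matching live window.
flick = [(0, 0, 1), (0, 0, 3), (0, 0, 1)]
live = [(0, 0, 1), (0, 0, 3), (0, 0, 1)]
print(recognize(flick, live))  # identical traces -> True
```

On a match, the Pebble app would send a gesture identifier to the phone, where the Android app maps it to the user-configured action.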

Challenges I ran into

The largest hurdle was the data analysis. First, we considered a simplistic approach that held the phase offset constant while allowing the speed of the gesture to vary. However, we discovered that this approach introduced massive uncertainty into the analysis, so we moved to the more difficult approach of cross-correlating the gesture data.
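The advantage of cross-correlation is that it removes the fixed-phase-offset assumption: sliding the template over the signal scores every possible alignment, so the gesture can begin anywhere in the accelerometer buffer. A minimal pure-Python sketch under our own naming (the project's actual analysis code may differ):

```python
# Illustrative sketch: cross-correlating a gesture template against a
# signal finds the offset where the two line up best, making recognition
# robust to when the gesture starts within the buffer.

def cross_correlation(signal, template):
    """Dot product of the template with the signal at every valid offset."""
    n = len(signal) - len(template) + 1
    return [
        sum(s * t for s, t in zip(signal[k:k + len(template)], template))
        for k in range(n)
    ]

def best_offset(signal, template):
    """Offset at which the template aligns best with the signal."""
    scores = cross_correlation(signal, template)
    return max(range(len(scores)), key=scores.__getitem__)

# A gesture "spike" buried 3 samples into an otherwise flat signal.
template = [1, 4, 1]
signal = [0, 0, 0, 1, 4, 1, 0, 0]
print(best_offset(signal, template))  # -> 3
```

The peak correlation score can then be compared against a threshold to decide whether the gesture occurred at all.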

Accomplishments that I'm proud of

We were able to create a very immersive tool for Pebble, allowing users to control features on their smartphones with gestures.

What I learned

We learned a great deal about both the Pebble SDK and the Android SDK.

What's next for Ripple

Ultimately, we would love to turn this into an API that lets gestures control features in iOS and Android applications.
