We wanted the Pebble to be more responsive to someone who is traveling.

What it does

Based on your location, it generates a custom watchface showing an image selected from your surroundings. Our algorithm picks the image that best matches your environment, and it also favors images that match the current time of day, e.g. sunsets and night scenes.
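As a rough illustration of the time-of-day part of the selection, here is a minimal Python sketch. The tag names, hour ranges, and scoring are hypothetical stand-ins, not the project's actual algorithm; in practice the tags would come from something like the Clarifai API.

```python
from datetime import datetime

# Illustrative time-of-day buckets; the real algorithm's tags
# and thresholds are not specified in this write-up.
TIME_TAGS = {
    "morning": list(range(5, 11)),
    "day": list(range(11, 17)),
    "sunset": list(range(17, 21)),
    "night": list(range(21, 24)) + list(range(0, 5)),
}

def time_match_score(image_tags, hour):
    """Score an image higher when its tags match the current hour."""
    return sum(
        1
        for tag, hours in TIME_TAGS.items()
        if tag in image_tags and hour in hours
    )

def pick_image(candidates, now=None):
    """candidates: list of (image_id, tags) pairs; returns the pair
    whose tags best fit the current time of day."""
    hour = (now or datetime.now()).hour
    return max(candidates, key=lambda c: time_match_score(c[1], hour))
```

A fuller version would combine this score with location and environment matching before choosing the final image.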

How I built it

The Pebble watch component was built in C with Pebble's C SDK. The server and its REST API were built in Python using Flask; the API communicates with the Android app and calls the Clarifai API for image processing. The Android app was built in Java.
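The server side can be sketched as a small Flask app. The route name, parameters, and response shape below are assumptions for illustration, not the project's actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical endpoint: the Android app would send its location
# and get back metadata for the selected image.
@app.route("/watchface")
def watchface():
    lat = request.args.get("lat", type=float)
    lon = request.args.get("lon", type=float)
    if lat is None or lon is None:
        return jsonify(error="lat and lon are required"), 400
    # In the real server, this is where nearby images would be
    # fetched and ranked (e.g. via the Clarifai API).
    return jsonify(lat=lat, lon=lon, image_id="placeholder")
```

The watch itself never talks to the server directly; the Android app acts as the bridge, forwarding the chosen image to the Pebble over Bluetooth.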

Challenges I ran into

The only image type Pebble supports is bitmaps, and only at the display's 144x168 resolution, so any image fetched from Instagram had to be converted first. Pebble also has no smooth way to transfer images over Bluetooth from the phone to the watch, unlike text, so we had to build our own transfer method.
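The conversion step boils down to quantizing each pixel into the watch's native format. As a sketch, here is how a 24-bit RGB pixel maps onto Pebble's 8-bit color format (2 bits per channel); the helper names are ours, and the project's actual converter may differ:

```python
def rgb_to_gcolor8(r, g, b):
    """Quantize a 24-bit RGB pixel to Pebble's 8-bit per-pixel
    color format: 2 bits each of alpha, red, green, blue
    (0bAARRGGBB), with alpha forced to opaque here."""
    return 0b11000000 | ((r >> 6) << 4) | ((g >> 6) << 2) | (b >> 6)

def frame_to_bytes(pixels):
    """pixels: iterable of (r, g, b) tuples in row-major order
    for the watch's display; returns the raw framebuffer bytes."""
    return bytes(rgb_to_gcolor8(r, g, b) for r, g, b in pixels)
```

Each source image therefore has to be resized to the display resolution and run through a conversion like this before it can be sent to the watch.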

Accomplishments that I'm proud of

We were able to successfully convert new images to bitmaps, split them into byte chunks, transfer them over Bluetooth, and reassemble them on the Pebble watch.
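The chunk-and-reassemble idea can be sketched in a few lines of Python. The chunk size and tuple layout below are illustrative assumptions (real AppMessage payload limits depend on the SDK configuration), not the project's actual protocol:

```python
CHUNK_SIZE = 124  # illustrative; actual Bluetooth message limits vary

def chunk_image(data, chunk_size=CHUNK_SIZE):
    """Split bitmap bytes into (index, payload) pairs small enough
    to fit in individual phone-to-watch messages."""
    return [
        (i // chunk_size, data[i:i + chunk_size])
        for i in range(0, len(data), chunk_size)
    ]

def reassemble(chunks):
    """Reconstruct the original bytes from (index, payload) pairs,
    tolerating out-of-order arrival by sorting on the index."""
    return b"".join(payload for _, payload in sorted(chunks))
```

Numbering each chunk is what lets the watch rebuild the image correctly even if messages arrive out of order or have to be retried.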

What I learned

How to program watchfaces for Pebble in C, how to transfer data in chunks and reconstruct it later, and the basics of image processing.

What's next for TravelFaces

Add weather information to the watchface, and factor it into the image selection algorithm.
