Cole Alban, Frank Pogoda, Okechi Onyeje, Shah-Ameer Wali


Inspiration

We wanted to see a world where the blind can travel the streets safely and easily: no walking sticks, no service dogs, just technology.

What it does

Essentially, it guides a blind person with haptic feedback, indicating whether he or she is about to run into an object (animate or inanimate). The vibrations change in frequency depending on the distance to the object: there are four distance bands, each producing a different frequency. The closer the object, the higher the frequency.
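The band-to-frequency mapping can be sketched in JavaScript as follows. This is an illustrative sketch, not the team's actual code: the cutoff distances (in millimeters) and the frequencies are assumptions, since only the "four bands, closer means faster" behavior is described above.

```javascript
// Map a depth reading to one of four distance bands, each with its own
// vibration frequency. Cutoffs and frequencies here are illustrative.
const BANDS = [
  { maxDepthMm: 500,  frequencyHz: 8 },  // closest band -> fastest vibration
  { maxDepthMm: 1000, frequencyHz: 4 },
  { maxDepthMm: 2000, frequencyHz: 2 },
  { maxDepthMm: 4000, frequencyHz: 1 },  // farthest band that still vibrates
];

function vibrationFrequency(depthMm) {
  // Bands are sorted nearest-first, so the first match is the right one.
  const band = BANDS.find((b) => depthMm <= b.maxDepthMm);
  return band ? band.frequencyHz : 0; // beyond all bands: no vibration
}
```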

How we built it

It utilizes the Kinect v2's depth perception (camera and infrared), programmed in C# with Microsoft's Visual Studio. We processed the depth data points of the image (the smaller the integer, the closer the object), took the closest point, and sent it to the Firebase servers, which the cell phone connected to the Pebble smartwatch then accesses (all in JavaScript).
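The closest-point step of that pipeline can be sketched as below. This is an assumed reconstruction, not the team's code: a Kinect v2 depth frame is treated as a flat array of 16-bit distances in millimeters, where 0 means "no reading" and a smaller value means a closer object.

```javascript
// Scan a depth frame for the nearest valid reading and return its
// distance and pixel coordinates. Illustrative sketch only.
function closestPoint(depthFrame, width) {
  let best = { depthMm: Infinity, x: -1, y: -1 };
  for (let i = 0; i < depthFrame.length; i++) {
    const d = depthFrame[i];
    if (d > 0 && d < best.depthMm) {          // 0 = no reading, skip it
      best = { depthMm: d, x: i % width, y: Math.floor(i / width) };
    }
  }
  return best;
}
```

The resulting point would then be written to Firebase (for example, via a REST `PUT` to a path in the project's database) for the phone-side JavaScript to pick up and forward to the watch.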

Challenges we ran into

We wrote a whole API, just to end up using Firebase. We then re-wrote our entire code to work with the Pebble, switching from the Pebble SDK to CloudPebble.

Accomplishments that we're proud of

Capturing very accurate data from the Kinect, then relaying that data through a server to the Pebble watch in a matter of milliseconds.

What we learned

How to use C# with the Kinect. That JavaScript could have been used for our whole project. That CloudPebble is much better than the Pebble SDK.

What's next

Making the application mobile: camera glasses that use infrared to determine depth, connected via Bluetooth to two wristbands that vibrate.
