We wanted to see a world where the blind could travel through the streets safely and easily. No walking sticks, no service dogs, just technology.
What it does
Essentially, it guides a blind person with haptic feedback, indicating whether he or she is about to run into an object (animate or inanimate). The vibrations change in frequency depending on how far away the object is: there are four distance bands, each producing a different frequency. The closer the object, the higher the frequency.
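The band-to-frequency mapping above can be sketched as a simple lookup. The thresholds and frequencies here are illustrative assumptions, not the values the team actually used:

```python
# Sketch of the distance-to-frequency mapping described above.
# The four band thresholds (meters) and pulse frequencies (Hz) are
# hypothetical values chosen for illustration.
BANDS = [
    (0.5, 8.0),  # closest band  -> highest vibration frequency
    (1.0, 4.0),
    (2.0, 2.0),
    (4.0, 1.0),  # farthest band -> lowest vibration frequency
]

def vibration_frequency(distance_m: float) -> float:
    """Return the haptic pulse frequency (Hz) for an object at distance_m.

    Objects beyond the farthest band produce no vibration (0 Hz).
    """
    for threshold, freq in BANDS:
        if distance_m <= threshold:
            return freq
    return 0.0
```

Four discrete bands keep the signal easy to interpret by feel: the wearer only needs to distinguish "closer" from "farther", not read an exact distance.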
How we built it
Challenges we ran into
Wrote a whole API, just to end up using Firebase. Ended up rewriting our entire codebase to work with the Pebble, and switched from the Pebble SDK to Pebblecloud.
Accomplishments that we're proud of
Capturing very accurate depth data from the Kinect, and relaying that data through a server to the Pebble watch in a matter of milliseconds.
What we learned
What's next for Visionar.io
To make the application mobile: camera glasses that use infrared to determine depth, connected over Bluetooth to two wristbands that vibrate.