What it does
Using depth information derived from a single camera's frames and the device's accelerometer, howfar acts as a haptic-feedback ruler, letting severely visually impaired users perceive objects from farther away and with finer distance detail, while running on older Android devices that sell for under $60 online.
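For illustration only (the app itself keeps the depth data on the GPU, as described under Challenges below), a minimal Kotlin sketch of pulling a single distance reading out of an ARCore frame on the CPU could look like the following. It assumes ARCore 1.31+ with the session's depth mode enabled; the helper name centerDepthMm is made up for this example.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Returns the depth at the center of the ARCore depth image, in millimeters,
// or null if depth data is not yet available for this frame.
fun centerDepthMm(frame: Frame): Int? {
    val image = try {
        frame.acquireDepthImage16Bits()
    } catch (e: NotYetAvailableException) {
        return null
    }
    try {
        val plane = image.planes[0]
        val buffer = plane.buffer.order(ByteOrder.LITTLE_ENDIAN)
        // Each pixel is an unsigned 16-bit distance to the camera plane in millimeters.
        val index = (image.height / 2) * plane.rowStride + (image.width / 2) * plane.pixelStride
        return buffer.getShort(index).toInt() and 0xFFFF
    } finally {
        image.close()
    }
}
```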
How we built it
Challenges we ran into
Google's ARCore is built from the ground up for AR rendering first, not for raw data collection. As such, I had to come up with workarounds to use it in a "vision-first" way: namely, the depth image is rendered to an offscreen texture instead of simply being bound to a view, which lets the app retain full GPU acceleration while still saving a bit of battery.
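As a rough sketch of that kind of offscreen setup (not the project's actual code): the snippet below allocates a texture backed by a framebuffer object that a depth pass can render into instead of the screen. The function name and the single-channel texture format are assumptions, and it must run on a thread with a current OpenGL ES 2.0 context.

```kotlin
import android.opengl.GLES20

// Creates an offscreen render target: a texture plus a framebuffer object that
// draws into it. Returns (framebuffer id, texture id).
fun createOffscreenDepthTarget(width: Int, height: Int): Pair<Int, Int> {
    val tex = IntArray(1)
    GLES20.glGenTextures(1, tex, 0)
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0])
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST)
    // Single-channel storage is enough to hold one normalized depth value per pixel.
    GLES20.glTexImage2D(
        GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height, 0,
        GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, null
    )

    val fbo = IntArray(1)
    GLES20.glGenFramebuffers(1, fbo, 0)
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0])
    GLES20.glFramebufferTexture2D(
        GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0
    )
    check(GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) == GLES20.GL_FRAMEBUFFER_COMPLETE) {
        "Offscreen framebuffer is incomplete"
    }
    // Unbind so on-screen rendering is unaffected until the depth pass targets the FBO.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0)
    return fbo[0] to tex[0]
}
```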
Accomplishments that we're proud of
I'm quite proud of the final vibration intensity curve. A linear distance-to-intensity mapping made it difficult and muddy to judge an object's distance through the haptic feedback, but the curve I eventually settled on, $\left(\frac{30000}{\text{distance}}\right)^{1/8} \cdot 255 + 255$, offers a nice balance between short-range sensitivity and medium-range nuance.
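As a sketch of how that curve might drive the phone's haptics: the mapping into Android's 1 to 255 one-shot amplitude range and the 50 ms pulse length below are my assumptions, not details taken from the project.

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator
import kotlin.math.pow
import kotlin.math.roundToInt

// Evaluates the intensity curve described above for a distance in millimeters,
// then clamps the result into the 1..255 amplitude range VibrationEffect accepts.
fun intensityFor(distanceMm: Float): Int {
    val raw = (30000f / distanceMm).pow(1f / 8f) * 255f + 255f
    return raw.roundToInt().coerceIn(1, 255)
}

// Emits a short one-shot pulse whose strength encodes the measured distance.
// The 50 ms duration is an assumed value; requires API 26+.
fun pulseForDistance(vibrator: Vibrator, distanceMm: Float) {
    vibrator.vibrate(VibrationEffect.createOneShot(50L, intensityFor(distanceMm)))
}
```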
What we learned
Android APIs are made to be called from Java, and the NDK is an afterthought. Don't go solo when learning a new platform in a time-sensitive environment. Build tools for the blind, because they can't experience your UI mistakes.