Inspiration

Blind walking canes provide only spatial awareness. They cannot easily convey what objects are in the environment in front of a person.

What it does

An Android application that anyone can download.

Uses computer vision to detect objects near the user and gives audio and haptic cues to let them know about those objects. For example, if a person is getting close, the app gives the user a heads-up.
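
To make this concrete, here is a minimal sketch (not the app's actual code) of how a detection could be turned into audio and haptic cues on Android. It uses the fraction of the frame covered by the object's bounding box as a rough "closeness" proxy; the 0.25 threshold, the `CueEmitter` name, and the spoken phrase are illustrative assumptions.

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator
import android.speech.tts.TextToSpeech

class CueEmitter(context: Context) {
    private val tts = TextToSpeech(context) { /* init status ignored in this sketch */ }
    private val vibrator = context.getSystemService(Vibrator::class.java)

    // boxFraction: share of the camera frame covered by the object's bounding
    // box -- a crude stand-in for distance (bigger box ~ closer object).
    fun onDetection(label: String, boxFraction: Float) {
        if (boxFraction > 0.25f) {
            // Haptic cue: a short 200 ms buzz (API 26+).
            vibrator.vibrate(
                VibrationEffect.createOneShot(200, VibrationEffect.DEFAULT_AMPLITUDE)
            )
            // Audio cue: announce what is ahead, replacing any queued speech.
            tts.speak("$label ahead", TextToSpeech.QUEUE_FLUSH, null, "cue")
        }
    }
}
```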

How we built it

Built on TensorFlow and Android. Uses pre-trained models to do real-time processing of what the camera is currently seeing.
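
The general pattern looks something like the sketch below, using the TensorFlow Lite Task Library's `ObjectDetector` with a pre-trained SSD MobileNet model. The model filename, score threshold, and result cap are assumptions for illustration, not the team's exact configuration.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.Detection
import org.tensorflow.lite.task.vision.detector.ObjectDetector

class FrameDetector(context: Context) {
    private val detector = ObjectDetector.createFromFileAndOptions(
        context,
        "ssd_mobilenet_v1.tflite",                  // hypothetical asset name
        ObjectDetector.ObjectDetectorOptions.builder()
            .setScoreThreshold(0.5f)                // drop low-confidence boxes
            .setMaxResults(3)                       // keep the cues manageable
            .build()
    )

    // Run the pre-trained detector on one camera frame.
    fun detect(frame: Bitmap): List<Detection> =
        detector.detect(TensorImage.fromBitmap(frame))
}
```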

Challenges we ran into

Getting computer vision to run quickly enough on a smartphone that it can give real-time updates to users.
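
One common way to keep a mobile detector responsive (a sketch of the general technique, not necessarily the fix used here) is to drop incoming camera frames while an inference is still in flight, so the pipeline always processes the freshest frame instead of building a backlog:

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.atomic.AtomicBoolean

class ThrottledAnalyzer(private val runInference: () -> Unit) {
    private val busy = AtomicBoolean(false)
    private val executor = Executors.newSingleThreadExecutor()

    // Called once per camera frame; silently skips the frame if an
    // inference is already running on the background thread.
    fun onFrame() {
        if (busy.compareAndSet(false, true)) {
            executor.execute {
                try { runInference() } finally { busy.set(false) }
            }
        }
    }
}
```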

Accomplishments that we're proud of

Running real-time computer vision on a mobile phone.

What we learned

We used CruzHacks as an opportunity to learn computer vision. We learned the TensorFlow library and the different types of neural networks, and explored other computer vision libraries such as OpenCV.

What's next for EyeDog

Training a model on a dataset of imagery that matches what a camera typically sees while walking.

Integration with Google Maps to provide better walking directions.

Improved haptic feedback (vibrations).

Built With

Android, TensorFlow
