Inspiration

I've built accessibility products for the blind before: a Braille Bluetooth keyboard. I took that hack from a hackathon and developed it fully, to the point of an MVP and beta testing at a local school for the blind.

What it does

It's an assistive walking cane for the blind. It recognizes objects: what they are, where they are, how far away they are, and where they sit relative to the user. That data is then converted from text to speech and relayed to the user through speakers or headphones.
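A minimal sketch of what that camera-to-speech loop could look like, assuming a USB camera on the cane, a placeholder `detect_objects()` standing in for the actual TensorFlow model, and the pyttsx3 library for offline text-to-speech (all illustrative assumptions, not the exact code on the device):

```python
# Hypothetical sketch of the camera -> detection -> speech loop.
import cv2
import pyttsx3

def detect_objects(frame):
    """Placeholder: a real version would run the TensorFlow/OpenCV model here."""
    return [("chair", 1.2, "left")]  # (label, distance in meters, side)

def describe(detection):
    """Turn one detection into a short spoken phrase."""
    label, distance_m, side = detection
    return f"{label}, about {distance_m:.0f} meters, to your {side}"

def main():
    engine = pyttsx3.init()        # offline text-to-speech engine
    cap = cv2.VideoCapture(0)      # camera mounted on the cane
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        for det in detect_objects(frame):
            engine.say(describe(det))
        engine.runAndWait()        # speak queued phrases through speakers/headphones

if __name__ == "__main__":
    main()
```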

How I built it

I built it using a Raspberry Pi, a few custom Python scripts, TensorFlow, OpenCV, and the AWS IoT SDK for the machine-learning side.
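For the AWS IoT piece, a rough sketch of how the Pi might publish detection events over MQTT using the AWS IoT Device SDK for Python; the endpoint, certificate paths, and topic name below are placeholders, not the project's actual configuration:

```python
# Hypothetical sketch: publish detection events from the Pi to AWS IoT over MQTT.
import json
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

client = AWSIoTMQTTClient("seeing-ai-cane")
client.configureEndpoint("YOUR-ENDPOINT.iot.us-east-1.amazonaws.com", 8883)
client.configureCredentials("root-CA.pem", "private.key", "certificate.pem")
client.connect()

def publish_detection(label, distance_m, side):
    """Send one detection to the cloud for heavier processing or logging."""
    payload = json.dumps({"label": label, "distance_m": distance_m, "side": side})
    client.publish("cane/detections", payload, 1)  # QoS 1
```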

Challenges I ran into

Training a real machine learning model takes far too long within a 24-hour hackathon... Sensors also kept dying.

Accomplishments that I'm proud of

Being able to get the project mostly done despite a lot of setbacks.

What I learned

I learned a lot about ML, AI, and neural networks... This is one of the first big projects I've done in this area.

What's next for Seeing AI Cane

I would like to develop it further, to the point of practical everyday use, and pair it with my existing accessibility device, Brailletooth, in hopes of putting together an entire accessibility suite for the blind.

Built With

Python, Raspberry Pi, TensorFlow, OpenCV, AWS IoT