My goal for this Hackathon, when I was applying, was to create something awesome that could better the world. At the beginning of the Hackathon, I discovered that a visually impaired fellow hacker had frequent trouble getting around. Even with a walking stick as an aid, he could only reach so far, and could only tell what was in his surroundings if the stick made contact with something. My thought was: what if we could eliminate the inconvenient "whacking" stick and replace it with something even BETTER? So I took the Arduino kit I had bought just the day before the Hackathon and put it to good use.

What it does

The Self-Driving Human Assistance is an aid for the visually impaired that eliminates the need for a walking stick. An ultrasonic sensor scans the room/surroundings for obstacles up to 4 meters away and tells the person which direction is best to walk in. The output is a row of LEDs mounted on a pair of sunglasses: how far right or left the lit LED sits indicates the direction that is clear for walking. The output is designed this way for the partially sighted like Tim, who is blind in one eye and can only catch a quick glimpse of light in the other.
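The direction logic above could be sketched roughly like this (this is not the original firmware; the function names, LED count, and scan-angle count are assumptions): the sensor sweeps across the field of view, the angle with the most open space wins, and that angle is mapped onto one LED in the row on the glasses.

```cpp
// Hypothetical sketch of the scan-to-LED mapping, not the actual code.
const int NUM_LEDS = 8;        // LEDs mounted left-to-right on the frame
const int MAX_RANGE_CM = 400;  // the ultrasonic sensor tops out around 4 m

// Pick the index of the scan angle with the most open space.
int clearestAngle(const int distancesCm[], int numAngles) {
    int best = 0;
    for (int i = 1; i < numAngles; ++i) {
        if (distancesCm[i] > distancesCm[best]) best = i;
    }
    return best;
}

// Map a scan-angle index (0 = far left) onto an LED index, so the lit
// LED's position tells the wearer which direction is clear.
int angleToLed(int angleIndex, int numAngles) {
    return angleIndex * (NUM_LEDS - 1) / (numAngles - 1);
}
```

With, say, five scan angles, an obstacle-free reading straight ahead lights an LED near the middle of the row, while a clear path off to the left lights one near the left edge.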

How I built it

The program is written in C++ using the Arduino development software. The hardware consists of several electrical components wired onto a breadboard and mounted onto a cardboard headset.
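The core math an ultrasonic rangefinder relies on is simple: the sensor's echo pin stays high for the ping's round-trip time, and sound travels at roughly 343 m/s, so halving the round trip gives the distance. A minimal sketch of that conversion (the function name is my own, not from the project):

```cpp
// Convert an ultrasonic echo pulse width (microseconds) to centimeters.
// Sound covers ~0.0343 cm per microsecond; divide by 2 for the round trip.
long echoToCentimeters(long durationMicros) {
    return durationMicros * 343 / 20000;
}
```

On an actual Arduino, `durationMicros` would come from timing the echo pin (e.g. with `pulseIn()`), and a reading near the 4-meter mark signals the edge of the sensor's useful range.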

Challenges I ran into

One challenge I ran into was implementing SLAM into the functionality of the sensor, along with the limited processing speed. Another problem I ran into initially was that I had bought my Arduino kit from a knock-off store and it lacked a couple of software components, but once I found the issue and reinstalled them on the Arduino, it worked perfectly!

Accomplishments that I'm proud of

Hackathon Goal Completed.

What I learned

How to use and program an Arduino, and the many different ways it can control neat electronic devices like the ultrasonic sensor, servo motor, 4-digit 7-segment display, electric motor, and much more.

What's next for Self-Driving Human

Making it less hacked together (even though it was built at a Hackathon with whatever I could find). Enabling it to 3D-scan the room/environment at a more efficient pace. Making the output more universal for visually impaired people.

Built With

Arduino, C++
