Inspiration

For the visually impaired, the only real option for navigating the world remains the walking cane. These devices, like all others, deserve to be brought into and upgraded for the twenty-first century.

We began as two separate teams with similar ideas for improving the quality of life of the visually impaired; we joined forces to create a unique haptic system for visualizing one's surroundings.

What it does

Blindsight uses a simple ultrasonic sensor oscillating on a stepper motor to measure the distance from one's body to nearby objects, communicating that information through an array of vibration modules.
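
To make this concrete, here is a minimal sketch of the core sense-and-vibrate loop, assuming an HC-SR04-style ultrasonic sensor and a single PWM-driven vibration motor on a Raspberry Pi; all pin assignments and distance thresholds are placeholders, not our exact wiring.

```python
import time
import RPi.GPIO as GPIO

TRIG_PIN = 23    # hypothetical HC-SR04 trigger pin
ECHO_PIN = 24    # hypothetical HC-SR04 echo pin
MOTOR_PIN = 18   # hypothetical PWM pin driving one vibration module

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(ECHO_PIN, GPIO.IN)
GPIO.setup(MOTOR_PIN, GPIO.OUT)

motor = GPIO.PWM(MOTOR_PIN, 100)  # 100 Hz PWM for the vibration motor
motor.start(0)

def read_distance_cm():
    """Fire one ultrasonic ping and return the measured distance in cm."""
    GPIO.output(TRIG_PIN, GPIO.HIGH)
    time.sleep(10e-6)                 # 10 microsecond trigger pulse
    GPIO.output(TRIG_PIN, GPIO.LOW)
    start = end = time.time()
    while GPIO.input(ECHO_PIN) == 0:  # wait for the echo pulse to begin
        start = time.time()
    while GPIO.input(ECHO_PIN) == 1:  # time how long the echo pulse lasts
        end = time.time()
    # Sound travels at ~343 m/s, and the ping covers the distance twice.
    return (end - start) * 34300 / 2

try:
    while True:
        dist = read_distance_cm()
        # Vibrate harder the closer the obstacle: full strength at or
        # under 20 cm, fading to silence beyond 200 cm.
        duty = max(0.0, min(100.0, (200.0 - dist) / 180.0 * 100.0))
        motor.ChangeDutyCycle(duty)
        time.sleep(0.05)              # roughly 20 readings per second
finally:
    motor.stop()
    GPIO.cleanup()
```

In the full system, a loop like this runs while the stepper motor sweeps the sensor, and each reading is routed to whichever vibration module corresponds to the current sweep angle.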

How we built it

Blindsight was created with a Raspberry Pi, 3D-printed parts from the Hardware Lab, and, of course, an infamous top hat.

Challenges we ran into

Tackling complex challenges on both the hardware and software sides made for a chaotic yet educational hacking experience. Our most notable hurdles were failing to get a Qualcomm device to connect to WiFi and debugging vague wiring problems.

Accomplishments that we're proud of

We are proud to have created a unique experience for the visually impaired. We were also ecstatic when our minimum viable product allowed us to navigate hallways and rooms relying solely on haptic feedback.

What we learned

We had to play a clear cost-benefit game, balancing the responsiveness of the tool against the precision it was potentially capable of. In the end, we favored an approximated experience that did not take full advantage of the ultrasonic sensor's accuracy. This choice made the tool more responsive to changes in the environment, even letting us know when someone walked into our path.
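
As a sketch of that trade-off, quantizing each reading into a few coarse bins discards precision but keeps the haptic output stable and fast; the bin edges and intensities below are illustrative, not our calibrated values.

```python
# Hypothetical bin edges (cm) and the vibration duty cycle (%) for each bin;
# the real values would come from hallway testing.
BIN_EDGES_CM = [50, 100, 200]
DUTY_LEVELS = [100, 60, 30, 0]  # one more level than edges: "beyond 200 cm"

def distance_to_duty(distance_cm: float) -> int:
    """Quantize a raw distance into one of a few vibration intensities.

    Throwing away sub-bin precision means small sensor noise rarely
    flips the output, so the feedback feels steady while still reacting
    within a single reading when someone steps into the wearer's path.
    """
    for edge, duty in zip(BIN_EDGES_CM, DUTY_LEVELS):
        if distance_cm < edge:
            return duty
    return DUTY_LEVELS[-1]
```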

What's next for Blindsight

Blindsight has been an eye-opening experience, and future steps would involve miniaturizing the systems on the top hat, increasing the speed and responsiveness of the device, and integrating a more reliable power source.
