Inspiration

The inspiration for Clear Path came from realizing a critical gap in current assistive technology for the blind and visually impaired. While the white cane is the gold standard for navigating ground-level obstacles like curbs and steps, it has a dangerous "blind spot": the area from the waist up. We learned that many blind individuals suffer injuries from head-level hazards (hanging tree branches, protruding street signs, or open truck tailgates) that a cane simply cannot detect. We didn't want to replace the cane; we wanted to complete it. Our goal was to build a "Third Eye" that guards the upper body, giving users the confidence to walk without fear of the unexpected.

What it does

Clear Path is a wearable assistive device that detects obstacles at chest and head level. It continuously scans the area in front of the user with ultrasonic waves and acts as a real-time proximity guard, translating distance into intuitive audio or haptic feedback. Instead of complex speech, it uses an "Inverse Proximity Algorithm":

  • Safe Zone: Silence.
  • Caution Zone: Slow, low-pitched beeps (or gentle pulses).
  • Danger Zone: Fast, high-pitched alerts (or rapid vibration).

We also built a companion React dashboard that connects via USB (Web Serial). This allows a sighted friend or engineer to see the raw sensor data and "Status Level" (Safe/Warning/Critical) on a screen in real time, effectively visualizing what the device sees.
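As a concrete sketch, the zone logic boils down to a threshold check on each distance reading. The centimeter thresholds below are illustrative assumptions, not the exact values we tuned:

```cpp
#include <cassert>

// Sketch of the "Inverse Proximity Algorithm" zone logic.
// The cm thresholds are illustrative assumptions.
enum Zone { SAFE, CAUTION, DANGER };

Zone classifyZone(float distanceCm) {
    if (distanceCm < 60.0f)  return DANGER;   // fast, high-pitched alerts / rapid vibration
    if (distanceCm < 150.0f) return CAUTION;  // slow, low-pitched beeps / gentle pulses
    return SAFE;                              // silence
}
```

On the device, the `DANGER` and `CAUTION` branches drive the buzzer or vibration motor; `SAFE` keeps the device silent so it never nags the user.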

How we built it

We built Clear Path using a combination of embedded hardware and modern web technologies.

The Hardware:

  • Arduino Nano/Uno: The brain of the operation, handling sensor timing and logic.
  • HC-SR04 Ultrasonic Sensor: Chosen for its reliability and acoustic ranging capabilities.
  • Piezo Buzzer / Vibration Motor: Provides immediate, low-latency feedback to the user.

The Software (C++):
  • We wrote a custom C++ algorithm that filters out sensor noise (using a running average) and maps distance (cm) to frequency (Hz) and tempo (BPM). This creates a smooth "analog" feel rather than a jerky digital on/off switch.

The Web Interface (React.js):
  • We used the Web Serial API to allow the Arduino to talk directly to a Chrome browser.

The frontend is built with React, featuring a dynamic UI that changes color (Green/Yellow/Red) based on the threat level, serving as a "Mission Control" for the device.
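The firmware's distance-to-sound mapping can be sketched as a linear map from a clamped proximity value, so pitch and tempo rise together as an obstacle gets closer. All of the range constants here are illustrative assumptions, not the exact firmware values:

```cpp
#include <algorithm>
#include <cassert>

// Sketch of the distance -> (pitch, tempo) mapping.
// Closer obstacle -> higher pitch and faster tempo.
// All constants below are illustrative assumptions.
const float NEAR_CM = 20.0f,  FAR_CM  = 150.0f;   // sensing window
const float MIN_HZ  = 400.0f, MAX_HZ  = 2000.0f;  // buzzer pitch range
const float MIN_BPM = 60.0f,  MAX_BPM = 600.0f;   // beep tempo range

// Normalize distance to 0..1, where 1.0 means "closest".
float proximity(float cm) {
    float t = (FAR_CM - cm) / (FAR_CM - NEAR_CM);
    return std::min(1.0f, std::max(0.0f, t));
}

float pitchHz(float cm)  { return MIN_HZ  + proximity(cm) * (MAX_HZ  - MIN_HZ);  }
float tempoBpm(float cm) { return MIN_BPM + proximity(cm) * (MAX_BPM - MIN_BPM); }
```

On an Arduino, `pitchHz` would feed `tone()` and `tempoBpm` would set the delay between beeps; because both are driven by the same proximity value, the sound ramps smoothly instead of snapping between discrete states.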

Challenges we ran into

  • Initially, we considered replacing the white cane outright. After researching and role-playing the user experience, we realized this was dangerous and insensitive. We pivoted to augmenting the cane, which completely changed our design philosophy to focus on head-level obstacles only.
  • Ultrasonic sensors are noisy, sometimes reporting 0 cm or random spikes. We had to implement a software smoothing filter (a moving average of the last 5 readings) to stop the buzzer from glitching.
  • Constant beeping is annoying. We spent hours tuning the "Silence Threshold" so the device only speaks up when it actually matters, preventing sensory overload for the user.
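A minimal sketch of that 5-sample smoothing filter follows. The window size matches the write-up; rejecting zero readings as failed echoes is our assumption about how the glitches were handled:

```cpp
#include <cassert>

// 5-sample moving-average filter to tame HC-SR04 noise.
// Window size is from the write-up; zero-rejection is an assumption.
const int WINDOW = 5;

class SmoothedDistance {
    float buf[WINDOW] = {0};
    int   idx = 0;
    int   count = 0;
public:
    // Feed one raw reading (cm); returns the smoothed value.
    float update(float rawCm) {
        // The HC-SR04 reports 0 on a failed echo -- skip those samples.
        if (rawCm <= 0.0f) return average();
        buf[idx] = rawCm;
        idx = (idx + 1) % WINDOW;
        if (count < WINDOW) count++;
        return average();
    }
    float average() const {
        if (count == 0) return 0.0f;
        float sum = 0.0f;
        for (int i = 0; i < count; i++) sum += buf[i];
        return sum / count;
    }
};
```

A fixed-size ring buffer like this keeps memory use constant, which matters on the Arduino's 2 KB of SRAM.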

Accomplishments that we're proud of

  • We achieved near-instant feedback. The moment you wave your hand in front of the sensor, the sound changes. It feels responsive and "alive."
  • We successfully coded the mapping function so that both the pitch and the speed of the beeps change simultaneously. It's a small detail that makes the device much easier to understand instinctively.
  • Connecting a piece of hardware (Arduino) to a modern web app (React) via the Web Serial API was a huge technical win for us.

What we learned

  • Designing for accessibility isn't just about sensors; it's about context. We learned that haptic feedback (vibration) is often preferred over audio because blind users rely on their hearing for environmental cues (like traffic).
  • We learned to work within the limits of the Arduino's memory and processing speed, optimizing our code to keep the loop tight and responsive.
  • Sometimes the best solution isn't to reinvent the wheel, but to make the wheel safer.

What's next for Clear Path

  • We plan to replace the USB cable with an HC-05 Bluetooth module and build a React Native mobile app. This will allow the user to adjust sensitivity settings on the fly.
  • We want to integrate a camera and use a lightweight computer vision model (like YOLO or MobileNet) to tell the user what is in front of them (e.g., "Branch," "Sign," "Person"), not just that something is there.
  • Adding two more sensors angled at 45 degrees to create a protective "cone" rather than a single line of sight, preventing the user from clipping their shoulder on doorframes.

Built With

  • arduino
  • c++
  • sensor
  • ultrasonic