Inspiration

It all started when we saw the results of a survey that asked a thousand people to review a series of health conditions and pick the one they believed would have the greatest impact on their quality of life. Blindness was chosen by the overwhelming majority, outranking conditions like HIV/AIDS and cancer. Today, nearly 285 million people across the globe live with serious visual impairment, and an estimated 40 million are completely blind. OrionHacks presented us with an opportunity to develop a solution that empowers this group. More specifically, we wanted to create something that makes their lives easier, so we decided to focus on navigation.

But we were unsure where to start, so for inspiration we looked to a little creature that navigates with a sense other than vision: the bat. To navigate dark caves, bats use echolocation: they emit sound waves and listen for them to bounce off obstacles in their environment, which tells them how close they are to their surroundings.

After a few days of hard work, fried circuits, and coding, we developed a low-cost, low-power, and practical solution to this problem.

What it does

Our device, echoSense, gives its user a form of echolocation, allowing blind individuals to navigate their environment more effectively, quickly, and safely.

When the user presses the trigger button, the sensor emits an ultrasonic pulse, which bounces off an obstacle and returns to the sensor. By taking the difference between the timestamp at which the pulse was emitted and the timestamp at which its echo returned, multiplying by the speed of sound in air, and halving the result (the pulse travels to the obstacle and back), we derive an accurate measurement of the distance to objects in the environment.

To communicate this distance to the user, the echoSense's microcontroller uses Bluetooth to talk to a haptic feedback interface that lets the user perceive the distance intuitively. We used the Neosensory Buzz haptic feedback device for this task: based on the distance detected by the sensor, its motors vibrate at a different relative intensity.

How we built it

The hardware consisted of an Adafruit nRF52840 microcontroller, an HC-SR04 ultrasonic sensor, a 3.7 V 350 mAh lithium-ion battery, two 2.2 kΩ resistors (forming a voltage divider that steps the sensor's 5 V echo output down to a level safe for the microcontroller), and copper wire. All components were soldered together.

The software was developed in C++ in the Arduino IDE. We also wrote a custom, modified library for the HC-SR04 sensor for use in our project.

Challenges we ran into

We faced a hardware challenge interfacing the 5 V logic of our ultrasonic sensor with the 3.3 V logic of our microcontroller, but by wiring a pair of resistors as a voltage divider on the echo line, we got the device working with full functionality.

Accomplishments that we're proud of

We're very glad we were able to build a project that addresses a huge global issue. We think our prototype is viable and, with more work, could become a practical low-cost, low-energy device that makes navigation easier for the visually impaired.

What we learned

We learned a lot about making and modifying Arduino libraries for sensors, and about how to use resistors to let electronic components with different operating voltages work together. We also learned good practices for making haptic feedback understandable to the user.

What's next for echoSense

We think we could take echoSense to the next level by integrating it with AI-powered computer vision (perhaps via a Raspberry Pi or NVIDIA Jetson Nano). This could further empower the blind with better, more advanced navigation aids.
