We wanted to build a piece of technology that could help the blind. Instead of seeing objects, the wearer hears nearby objects through something similar to echolocation.
What it does
The Echolocation Glove senses the distance to an object using ultrasound. The sensor's echo timing is converted into a distance in centimeters, and a tone is played through a speaker at a frequency that varies with how close the object is. The distance data is also sent to a computer, which graphs it; the graphed data could be used for a number of purposes.
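The two conversions above can be sketched in a few lines. This is our own reconstruction, not the team's actual code: it assumes an HC-SR04-style sensor that reports the echo round-trip time in microseconds, and the distance range and pitch range are illustrative.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # speed of sound at roughly room temperature

def echo_to_cm(echo_us: float) -> float:
    """Convert a round-trip echo time (microseconds) to a one-way distance (cm)."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2  # halve: sound travels out and back

def distance_to_freq(cm: float, d_min=2.0, d_max=200.0,
                     f_near=2000.0, f_far=200.0) -> float:
    """Map distance to a tone frequency: closer objects sound higher-pitched."""
    cm = min(max(cm, d_min), d_max)      # clamp to the sensor's useful range
    t = (cm - d_min) / (d_max - d_min)   # 0.0 at d_min, 1.0 at d_max
    return f_near + t * (f_far - f_near)

print(echo_to_cm(1000))        # ~17.15 cm
print(distance_to_freq(2.0))   # 2000.0 Hz for a very close object
```

On the actual glove, the same math would run on the Arduino, with the echo time from `pulseIn()` and the resulting frequency fed to `tone()`.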
How we built it
We used an Arduino, headphones, and an ultrasonic sensor to gather data from the environment, then sent that data to a computer to be graphed.
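If the Arduino prints one distance reading per line over its serial port, the computer side only needs to parse those lines before plotting. Here is a minimal sketch of that parsing step; the line format and the function name are our assumptions, not the project's actual code.

```python
def parse_readings(lines):
    """Parse serial output lines into distances (cm), skipping malformed lines."""
    readings = []
    for line in lines:
        try:
            readings.append(float(line.strip()))
        except ValueError:
            pass  # ignore noise or partial lines from the serial port
    return readings

# In the real setup the lines would come from the serial port (e.g. via pySerial)
# and the result would be fed to a live plot; here we only show the parsing.
print(parse_readings(["17.1", "garbage", "3.4\n"]))  # [17.1, 3.4]
```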
Challenges we ran into
The current version is limited: the sensors are not very accurate, and mounting the electronics on the glove was difficult with limited resources. Automatically graphing the incoming data was another challenge.
Accomplishments that we're proud of
Getting data from the environment and representing it in an audio and visual format!
What we learned
We learned how ultrasound, echolocation, and sonar work!
What's next for Echolocation Glove
Improve the sensors, fix some bugs, and properly mount the glove.