Inspiration

The original inspiration came from an aunt with degrading vision. That led to a simple question: if cars can navigate autonomously, why can't we build hardware to help the blind navigate? And if we can, is it possible to do it cheaply?

What it does

It starts simple: a scanning array of ultrasound sensors provides acoustic feedback about what it sees in the environment, in essence using a sensing approach similar to that of the common Roomba robots.
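As a rough illustration of the idea (the exact mapping in the build may differ), a range reading can be converted into a pitch so that nearer obstacles produce higher notes:

// Illustrative distance-to-pitch mapping; the numbers are assumptions, not the final tuning.
int distanceToMidiNote(float distanceCm) {
  if (distanceCm < 2.0)   distanceCm = 2.0;     // clamp to a typical ultrasonic sensor's range
  if (distanceCm > 400.0) distanceCm = 400.0;
  // 2 cm (very close) maps to MIDI note 96; 400 cm (far) maps down to note 36.
  return (int)(96.0 - (distanceCm - 2.0) * 60.0 / 398.0);
}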

How I built it

The unit is a wearable headband pulled from a headlamp. The sensors are housed in custom 3D-printed shells. The acoustic feedback comes from speakers located behind the ears so as not to reduce the wearer's ability to hear their surroundings. The ultrasound sensors feed data to an Arduino, and a MIDI musical instrument shield generates the audio cues.
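A minimal sketch of the sensing-and-cue loop, assuming an HC-SR04-style ultrasonic sensor, a shield that accepts standard MIDI over serial at 31250 baud, and hypothetical pin assignments; the actual firmware may be wired and tuned differently:

#include <SoftwareSerial.h>

const int TRIG_PIN = 8;                 // hypothetical ultrasonic trigger pin
const int ECHO_PIN = 9;                 // hypothetical ultrasonic echo pin
SoftwareSerial midiSerial(2, 3);        // RX, TX to the instrument shield's MIDI input

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  midiSerial.begin(31250);              // standard MIDI baud rate
}

// Take one range sample in centimetres from the ultrasonic sensor.
float readRangeCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long echoUs = pulseIn(ECHO_PIN, HIGH, 30000);   // ~30 ms timeout
  return echoUs * 0.0343 / 2.0;                   // speed of sound, round trip
}

// Send a three-byte MIDI message (status byte plus two data bytes) to the shield.
void sendMidi(byte cmd, byte data1, byte data2) {
  midiSerial.write(cmd);
  midiSerial.write(data1);
  midiSerial.write(data2);
}

void loop() {
  float range = readRangeCm();
  if (range > 0) {
    // Nearer obstacles play higher notes (illustrative mapping only).
    int note = constrain(map((int)range, 2, 400, 96, 36), 36, 96);
    sendMidi(0x90, note, 100);          // Note On, channel 1
    delay(100);
    sendMidi(0x80, note, 0);            // Note Off
  }
  delay(100);
}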

Challenges I ran into

So many... Acquiring environment range, velocity, and acceleration data proved trivial compared to presenting that information to a user in a meaningful manner. The original goal was to use an accelerometer to correlate samples in time, to better cue the user when targets of concern enter the field of view. Time ran out before this could be explored.
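For context, deriving velocity and acceleration from successive range samples is the easy part; a rough sketch (variable names and units are illustrative, not taken from the actual firmware):

float prevRange = 0.0, prevVelocity = 0.0;
unsigned long prevTimeMs = 0;

// Update motion estimates from one new range sample.
void updateKinematics(float rangeCm) {
  unsigned long nowMs = millis();
  float dt = (nowMs - prevTimeMs) / 1000.0;            // seconds since last sample
  if (prevTimeMs != 0 && dt > 0) {
    float velocity = (rangeCm - prevRange) / dt;       // cm/s, negative means approaching
    float accel    = (velocity - prevVelocity) / dt;   // cm/s^2
    // The hard part: turning velocity and accel into a cue the user can actually act on.
    prevVelocity = velocity;
  }
  prevRange  = rangeCm;
  prevTimeMs = nowMs;
}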

Accomplishments that I'm proud of

Combining 3D modeling, circuit building, and programming, and getting something working in the short time available.

What I learned

Audio may not be the best means of conveying information. Vibration or other haptic methods may be better suited.

What's next for eySonos

The next steps fall into two categories now that I have a working base:

1) Continue to explore options for providing cues to a user.
2) Collect data and see if detection, tracking, and machine learning can be used to classify targets in the environment.
