Inspiration

In the age of computer vision, machines can potentially see in more detail than the human eye. For the visually impaired, leveraging this technology could open up alternative ways to perceive the world.

What it does

The goal of this project is to give the visually impaired a system to "see" the world through two other senses: touch and hearing. Through computer vision, the user gets immediate tactile feedback that acts as a warning of nearby objects, so they know what to avoid. In addition, they can request an audio description of their surroundings at will, giving them a more complete picture of their environment.

How we built it

We used an Arduino to control the touch feedback. For this prototype, we used a Kinect to capture images and motors to create the vibrations. We used OpenCV to process the images and detect nearby objects, and GCP Vision AI to analyze images for the audio feedback.
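As a rough illustration of how these pieces fit together, here is a minimal Python sketch of the two feedback paths. It assumes the Kinect depth frame is available as a NumPy array calibrated to millimetres (e.g. via libfreenect's Python bindings) and that the Arduino listens on a serial port for a hypothetical one-byte "vibrate" command; the port name, distance threshold, and contour-area cutoff are all placeholder values, and the GCP Vision call requires credentials to be configured in the environment.

```python
import cv2
import numpy as np
import serial
from google.cloud import vision

arduino = serial.Serial("/dev/ttyACM0", 9600)  # hypothetical port and baud rate
vision_client = vision.ImageAnnotatorClient()  # needs GCP credentials set up

NEAR_MM = 900       # placeholder "warning" distance in millimetres
MIN_AREA_PX = 5000  # placeholder contour area, to ignore sensor noise

def warn_if_near(depth_frame: np.ndarray) -> None:
    """Vibrate when a sufficiently large object is closer than NEAR_MM."""
    # Mask pixels that report a valid depth inside the warning distance.
    near_mask = ((depth_frame > 0) & (depth_frame < NEAR_MM)).astype(np.uint8)
    contours, _ = cv2.findContours(near_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if any(cv2.contourArea(c) > MIN_AREA_PX for c in contours):
        arduino.write(b"V")  # hypothetical one-byte "vibrate" command

def describe_scene(bgr_frame: np.ndarray) -> str:
    """Build a short spoken-style description from GCP Vision labels."""
    ok, jpeg = cv2.imencode(".jpg", bgr_frame)
    response = vision_client.label_detection(
        image=vision.Image(content=jpeg.tobytes()))
    labels = [label.description for label in response.label_annotations[:3]]
    return "Around you: " + ", ".join(labels)
```

The string returned by describe_scene could then be handed to any text-to-speech engine to produce the audio output; in the finished system the warning check would run continuously on each frame, while the description would run only on the user's request.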

Challenges we ran into

On the hardware side, some of the components burned out and we had to replace the motors that produced the vibrations. On the software side, the challenge was working within the limits of the hardware to deliver audio and tactile feedback in a way that makes intuitive sense to the user.

Also, being a two-person team limited how much we could get done. One of us handled the hardware and touch feedback while the other handled the audio side with the GCP APIs, but because we were each focused on our own tasks, we didn't have enough time to actually connect the two pieces.

Accomplishments that we're proud of

We think we did quite a lot in one weekend for a hardware hack! We worked across a full stack of tools, and we're proud of what we accomplished in such a short amount of time.

What we learned

Because of the challenges we faced, we learned that when working as a team against a tight deadline, it's best to keep the integration between the different parts in mind while developing. We should design for the whole user experience rather than thinking about it only feature by feature.

What's next for Mason Vision for the Blind

With more advanced resources, we would move to more portable technology. For the vibrations, we envision this product being small, perhaps a wearable paired with glasses, so the feedback surrounds the user's head. Furthermore, with more research we would try to make the feedback as useful and intuitive as possible, perhaps by varying the vibration patterns and analyzing which objects are most relevant for audio feedback.
