Visually impaired people face challenges that the average person doesn't have to worry about, such as navigating a room or buying groceries. We want to give them as much feedback about their environment as we possibly can.

What it does

The headband detects objects in front of the user and, when one is found, provides directional feedback by buzzing against their head. A second sensor/feedback module on the ankle detects ground-level obstacles. The user can also press a button to hear a spoken description of what is in front of them.
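
A minimal sketch of that detect-and-buzz loop, assuming a Raspberry Pi-style controller with the gpiozero library; the pin numbers, one-meter threshold, and single-sensor setup are illustrative, not our exact wiring:

```python
from time import sleep
from gpiozero import DistanceSensor, OutputDevice

WARN_DISTANCE_M = 1.0  # illustrative threshold, in meters

# Placeholder pins -- the real build uses several sensors and motors.
sonar = DistanceSensor(echo=24, trigger=23, max_distance=3)
motor = OutputDevice(17)  # vibration motor on the inside of the headband

while True:
    if sonar.distance < WARN_DISTANCE_M:
        motor.on()   # buzz: an object is in the user's path
    else:
        motor.off()
    sleep(0.05)
```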

How we built it

We use lidar and sonar sensors on the headband to detect obstacles in front of the user. If an obstacle is detected within a certain distance, haptic motors on the inside of the headband and the ankle module activate to warn the user that an object is in their path. The user can then press a button that sends a camera frame to Google Cloud's Vision API, whose deep-neural-network models label the scene, and the result is read aloud to the user.
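
As a rough sketch of that button-press path, assuming a Python-capable controller with the official google-cloud-vision client and pyttsx3 for speech output; the function names and the camera-capture step are illustrative stand-ins:

```python
from google.cloud import vision
import pyttsx3

client = vision.ImageAnnotatorClient()  # expects GOOGLE_APPLICATION_CREDENTIALS
tts = pyttsx3.init()

def describe_scene(jpeg_bytes: bytes) -> str:
    """Label a camera frame with Cloud Vision and build a short sentence."""
    response = client.label_detection(image=vision.Image(content=jpeg_bytes))
    labels = [label.description for label in response.label_annotations[:3]]
    return "I see " + ", ".join(labels) if labels else "Nothing recognized"

def on_button_press(jpeg_bytes: bytes) -> None:
    """Called with the latest camera frame when the user presses the button."""
    tts.say(describe_scene(jpeg_bytes))
    tts.runAndWait()
```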

Challenges we ran into

A limited number of pins per board, wiring everything together, fitting components to a human head, and sewing.

Accomplishments that we're proud of

Learning how to sew

What we learned


What's next for BlindEyes

Creating a complete housing with custom boards to replace the complexity of many separate, hand-wired components.

Built With

Lidar and sonar sensors, haptic motors, Google Cloud Vision API