EyeGuide was inspired by individuals who have difficulty seeing. I, Anish Aggarwal, have a dad who suffers from keratoconus, a disease in which the cornea thins and bulges into a cone shape, distorting vision. When he was diagnosed five years ago, he was a regular individual who didn't need glasses. Within those five years, however, his eyes deteriorated to the point that he now wears glasses that are half an inch thick. His trouble with vision inspired our team to create EyeGuide.

What it does

EyeGuide has two distinct elements for your convenience: ultrasonic sensors and a camera. When the sensors detect that you are about to collide with an object, they warn you through vibrations in the gloves, telling you whether to stop, turn left, or turn right. If you get confused, there are buttons on the side of the belt for camera functions.
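The obstacle-avoidance logic above could be sketched as a small function that maps distance readings to a glove directive. The thresholds, sensor layout, and function names here are assumptions for illustration, not the team's actual values:

```python
# Sketch: map left/center/right ultrasonic readings (in cm) to a glove
# vibration directive. STOP_CM and TURN_CM are assumed thresholds.
STOP_CM = 40   # an obstacle closer than this directly ahead: stop
TURN_CM = 80   # an obstacle inside this range steers you away from it

def directive(left_cm, center_cm, right_cm):
    """Return which cue to vibrate: 'stop', 'left', 'right', or None."""
    if center_cm < STOP_CM:
        return "stop"
    if center_cm < TURN_CM or min(left_cm, right_cm) < TURN_CM:
        # Turn toward whichever side has more clearance.
        return "left" if left_cm > right_cm else "right"
    return None  # path is clear, no vibration
```

On the real belt, the returned directive would drive the vibration motors through the Pi's GPIO pins.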

1) Call for help: You might find yourself stuck or confused and need someone to have a look around for you. This button sends a text message to your contact with a code they can enter on our website to connect with you. You can voice chat while they see what your camera sees.

2) Read writing: If a sign doesn't have braille, or if you simply want to scan for words, click this button and it will read aloud any writing in the camera's view.

3) Detect object: When you're unsure what an object is, simply click this button and it will tell you the best matches.
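The "call for help" flow above needs a short pairing code for the contact to enter on the website. A minimal sketch of that piece, where the code length, message wording, and function names are all assumptions for illustration:

```python
# Sketch of the "call for help" pairing flow: generate a short one-time
# code and the text message sent to the user's emergency contact.
import secrets
import string

def make_pairing_code(length=6):
    """Random alphanumeric code the contact enters on the website."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def help_message(code):
    """Body of the text message sent to the contact."""
    return (f"Your contact needs assistance. Enter code {code} on the "
            "EyeGuide website to see their camera and start a voice chat.")
```

The server would store the code against the active session so that entering it on the website joins the contact to that user's camera and voice stream.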

How we built it

This project uses sensors, a camera, two Raspberry Pis, and two of our servers. The gadget belt carries a Pi and an Arduino, giving you full mobility. The Pi connects to the internet and talks to our other components via HTTP requests. Besides the Arduino IDE, we worked with Python and Flask.
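The Pi-to-server communication described above could look something like this minimal Flask endpoint. The route name, payload fields, and the `run_detection` helper are hypothetical; the team's actual API may differ:

```python
# Minimal sketch of a server endpoint the belt's Pi could POST camera
# frames to for the "detect object" feature.
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_detection(image_bytes):
    # Placeholder standing in for the real vision model.
    return ["chair", "table"]

@app.route("/detect", methods=["POST"])
def detect():
    # The Pi uploads a camera frame; the server runs object detection
    # and returns candidate labels for text-to-speech on the device.
    frame = request.files.get("frame")
    if frame is None:
        return jsonify(error="no frame uploaded"), 400
    labels = run_detection(frame.read())
    return jsonify(matches=labels)
```

On the belt, the Pi would send the frame with a `requests.post(...)` call and speak the returned matches aloud.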

Challenges we ran into

Connecting all the different hardware in a short amount of time was stressful, as many sensors and components did not work well together. The Arduino 101 was a problem during development because it did not work well with our servo, so we switched to an Arduino Uno. One large challenge was an inconsistent network issue that kept our speech output from being as clean and crisp as we wanted it to be. GPIO port interference was another issue that slowed our motors at first; we fixed it with direct line connections.

Accomplishments that we're proud of

We were able to build, in under 36 hours, an apparatus that helps individuals with impaired eyesight navigate around obstacles, identify objects, read text aloud, and more. We are proud of how much interconnected hardware we got working together.

What we learned

We learned the importance of utilizing hardware in a hack. We also realized that incorporating different types of hardware brings many problems, and that it is vital to have secure connections between components. We learned that some types of hardware and software work well together and some do not: for example, the asynchronous voice over one router, the Arduino 101, the Raspberry Pi, the ultrasonic sensors, and much more. Being able to connect hardware to software is what makes a real innovator in this day and age.

What's next for EyeGuide

While this is just a prototype, building a legitimate and practical solution that can be worn is something we want to see happen.
