Inspiration

All the members of our group have friends or loved ones living with disabilities. We know how hard day-to-day life can be for people living with these conditions, and through this project we hope to help improve the lives of the visually impaired. While many solutions to navigation without sight exist, the goal of BlindSight was to create a 21st-century solution to this challenge.

What it does

BlindSight consists of two main components: proximity detection and QR code scanning. The proximity detection runs on a NodeMCU, which reads distance measurements from an ultrasonic sensor and transmits them to a server. An algorithm transforms this data into frequencies that are played in the user's ear: the higher the pitch, the closer the object, and therefore the greater the risk. In addition, BlindSight aims to help the visually impaired navigate through public spaces. Using QR codes, a building owner can mark important locations and provide directions to help people find their way around.
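The distance-to-pitch idea can be sketched as a simple mapping function. This is only an illustration: the range limits, frequency bounds, and linear interpolation below are assumptions, not the project's actual tuning.

```javascript
// Map an ultrasonic distance reading (cm) to a tone frequency (Hz):
// closer objects produce higher pitches, signalling greater risk.
// All numeric defaults here are illustrative assumptions.
function distanceToFrequency(distanceCm, {
  minDistance = 10,   // closest reading we expect from the sensor (cm)
  maxDistance = 200,  // beyond this, treat the path as effectively clear
  minFreq = 220,      // pitch for a far object (Hz)
  maxFreq = 1760,     // pitch for a very close object (Hz)
} = {}) {
  // Clamp readings to the sensor's usable range.
  const d = Math.min(Math.max(distanceCm, minDistance), maxDistance);
  // Invert: smaller distance -> value closer to maxFreq.
  const t = (maxDistance - d) / (maxDistance - minDistance);
  return minFreq + t * (maxFreq - minFreq);
}
```

A frequency produced this way can then be handed to any synthesizer, such as an oscillator on the playback device.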

How we built it

The proximity detection module consists of a NodeMCU, a mini breadboard, and an ultrasonic sensor. When the device is turned on, the NodeMCU connects to the network and the ultrasonic sensor starts measuring distance. This data is then sent to a server, which communicates via socket.io with a camera-enabled smart device (phone, tablet, etc.). There, the distance data is transformed into frequencies and played using Tone.js. In addition, the camera is used to process QR codes and play their messages as audio.
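The server's role in this pipeline is essentially a relay: accept readings from the NodeMCU and broadcast them to listening devices. A minimal sketch of that relay is below; the "distance" event name and the payload shape are assumptions, not the project's actual protocol.

```javascript
// Build a relay function around any broadcaster with an emit(event, payload)
// method, such as a socket.io server instance. Keeping the logic decoupled
// from socket.io makes it easy to test with a stub.
function makeDistanceRelay(io) {
  return function relay(rawReading) {
    const distanceCm = Number(rawReading);
    // Drop malformed or negative readings instead of forwarding noise.
    if (!Number.isFinite(distanceCm) || distanceCm < 0) return null;
    const payload = { distanceCm, at: Date.now() };
    io.emit("distance", payload); // broadcast to all connected clients
    return payload;
  };
}

// Wiring it to a real socket.io server might look like:
//   const io = require("socket.io")(3000);
//   const relay = makeDistanceRelay(io);
//   // call relay(reading) whenever the NodeMCU sends a measurement
```

On the receiving smart device, a socket.io client would listen for the same event and feed each reading to the tone synthesis step.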

Challenges we ran into

For security reasons, browsers do not allow web pages to access camera data over unsecured connections. This problem is usually solved by serving the page from a web server secured with SSL. However, our web server kept crashing, so that option was not viable either. Instead, we routed our plain-HTTP traffic through a secure proxy to circumvent these issues and allow our devices to access their cameras. In addition, getting the NodeMCU to communicate with the server was difficult, because sustaining an open communication channel across multiple networks is finicky, with numerous points of failure.

Accomplishments that we're proud of

BlindSight has been an idea of ours for over a year, and we are very proud to see it finally become reality. The moment we are most proud of was when all our individual components finally started talking to each other, despite the challenges of setting up the multiple methods of communication and the necessary protocols.

What we learned

From this project, we learned how to use a Node.js server to facilitate data transfer between different hardware devices. In addition, we learned how to integrate audio into our web apps, including how to synthesize tones and voices. Finally, we learned how to use the NodeMCU board to run Arduino-style electronics code as well as how to communicate wirelessly over a network.

What's next for BlindSight

In the short term, we would like to switch our location markers from QR codes to a proprietary visual code system in order to prevent confusion among the general public. In addition, we would like to use directional audio to give the user a better sense of where obstacles are. In the long term, we would like to improve the robustness of BlindSight's object and location detection by using SLAM (simultaneous localization and mapping) to map out locations in 3D.
