Inspiration

Approximately 39 million people in the world are legally blind. All of them, especially the elderly, face serious risks in their everyday lives. Our team explored ways to improve the quality of life for this group and created Touch+.

What it does

Touch+ uses haptic feedback to help users navigate their surroundings. An IR distance sensor, attached to a custom 3D-printed mount below the wrist, acts as a substitute for a traditional cane. Depth data gathered by the IR sensor is interpreted by a Raspberry Pi in the user's pocket. After analysis, the data is presented to the user as pulses on a Myo armband.
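
As a rough illustration of that loop, the depth reading can drive how quickly pulses arrive, with closer obstacles pulsing faster. A minimal sketch in Python; the sensor and armband calls are hypothetical stand-ins for the real drivers, and the 30 cm / 3 m range is illustrative:

```python
import time

def read_ir_distance_cm():
    """Stand-in for the real IR sensor driver on the Raspberry Pi."""
    return 150.0  # placeholder reading

def myo_pulse():
    """Stand-in for a short vibration command sent to the Myo armband."""
    pass

def pulse_interval_s(distance_cm, near=30.0, far=300.0):
    """Closer obstacles produce shorter gaps between pulses."""
    d = min(max(distance_cm, near), far)
    frac = (d - near) / (far - near)   # 0.0 at 30 cm, 1.0 at 3 m
    return 0.1 + 0.9 * frac            # 100 ms up to 1 s between pulses

while True:
    myo_pulse()
    time.sleep(pulse_interval_s(read_ir_distance_cm()))
```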

How we built it

Our system, at its core, is a dynamic interface: it lets several pieces of technology with huge potential work together to create the best overall experience for the user. That experience starts with the Myo armband, which can turn the entire system on with a simple flex of the wrist. Positioning data from the armband is then sent to the Raspberry Pi, which selects the positioning algorithm appropriate to the user's current orientation in space. Data from the IR sensor, prototyped in this project with an Xbox Kinect, is then gathered and transformed into a frequency. This frequency is sent back to the user's Myo and presented through the armband's haptic feedback.
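
A hedged sketch of that control flow, with all device I/O stubbed out (the function names, thresholds, and region sizes below are illustrative assumptions, not our production code):

```python
import numpy as np

def wrist_flexed():
    """Stand-in for the Myo gesture stream (e.g. a wrist-flex pose)."""
    return True

def arm_pitch_deg():
    """Stand-in for the Myo's IMU orientation stream."""
    return -10.0

def depth_frame_mm():
    """Stand-in for a Kinect depth grab; a 480x640 array in mm."""
    return np.full((480, 640), 1500, dtype=np.uint16)

def obstacle_distance_mm(depth, pitch_deg):
    """Choose the positioning algorithm from arm orientation: a wide,
    low region of the frame when pointing at the ground, a tight
    central window when pointing ahead."""
    h, w = depth.shape
    if pitch_deg < -30:                  # arm angled toward the floor
        roi = depth[h // 2:, :]
    else:                                # arm roughly level
        roi = depth[h//2 - 40:h//2 + 40, w//2 - 40:w//2 + 40]
    return float(np.median(roi))

def feedback_frequency_hz(d_mm, near=300.0, far=3000.0):
    """Map distance to a pulse frequency: nearer -> faster pulses."""
    d = min(max(d_mm, near), far)
    return 10.0 - 9.0 * (d - near) / (far - near)   # 10 Hz down to 1 Hz

if wrist_flexed():                       # the wrist flex arms the system
    freq = feedback_frequency_hz(
        obstacle_distance_mm(depth_frame_mm(), arm_pitch_deg()))
    print(f"haptic pulse frequency: {freq:.1f} Hz")
```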

Challenges we ran into

Interfacing so many different devices created a whole host of challenges. For instance, communicating between incomplete or closed implementations of different products forced us to develop a brand-new dynamic runtime communication pipeline to maintain data flow and reduce communication latency.
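
The core idea behind that pipeline can be sketched as single-slot channels between device threads, where a writer overwrites stale data so the reader always gets the freshest reading. This is a minimal illustration of the latency-reducing pattern, not our actual implementation:

```python
import queue
import threading
import time

class LatestValue:
    """Single-slot channel between two devices' threads: the writer
    overwrites stale data, so the reader always sees the freshest
    reading and latency stays low even if one side stalls."""
    def __init__(self):
        self._q = queue.Queue(maxsize=1)

    def put(self, item):
        try:
            self._q.get_nowait()   # drop the stale reading, if any
        except queue.Empty:
            pass
        self._q.put(item)

    def get(self):
        return self._q.get()       # block until a fresh reading arrives

depth_channel = LatestValue()

def sensor_loop():                 # Kinect/IR side (hypothetical read)
    while True:
        depth_channel.put(1500.0)  # placeholder depth reading, in mm
        time.sleep(0.01)

def feedback_loop():               # Myo side (hypothetical write)
    while True:
        _ = depth_channel.get()    # would be converted and sent to Myo

threading.Thread(target=sensor_loop, daemon=True).start()
threading.Thread(target=feedback_loop, daemon=True).start()
time.sleep(0.1)                    # let the pipeline run briefly
```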

Accomplishments that we're proud of

We have developed a portable, lightweight navigation system to aid the visually impaired. Communication latency between the devices in the system is near zero, so the user receives fast and accurate feedback about their environment. In testing, participants had no issue navigating twisting and turning hallways with their eyes closed. We attribute this success to feedback optimized for the position of the user's arm in space, which we track with the Myo. Users can seamlessly transition from feeling for subtle bumps by pointing toward the ground to searching for far-off walls by pointing directly in front of them. Finally, because these positioning algorithms are entirely IR-based, they work just as well in complete darkness.
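
One way to picture the seamless transition is to treat arm pitch as a continuous dial on the sensing range rather than a hard mode switch; the ranges below are illustrative, not our calibrated values:

```python
def far_clamp_m(pitch_deg):
    """Blend sensing range with arm pitch: straight down (-90 deg)
    scans roughly 1 m of near-field for bumps; level (0 deg) reaches
    about 4 m for walls. Values in between blend smoothly."""
    t = min(max((pitch_deg + 90.0) / 90.0, 0.0), 1.0)
    return 1.0 + 3.0 * t   # metres

assert far_clamp_m(-90.0) == 1.0 and far_clamp_m(0.0) == 4.0
```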

What we learned

Every team member came out of the project with far more knowledge of computer vision and hardware interfacing than they came in with.

What's next for Touch+

We would like to replace the prototype Kinect with a less bulky IR or LiDAR sensor. We plan to keep using the Myo armband and further refine the user experience by smoothing the transitions between the dynamic arm-position algorithms.
