Coding for Good: Health and Wellness

Over 20 million American adults have some form of vision loss, and the WHO estimates that at least 280 million people are visually impaired worldwide. Yet most buildings and campuses are not designed for accessibility and are difficult to navigate for people with vision loss. Existing assistive devices can draw unwanted attention or cover only a limited range of movement, and they rely on the user's intuition and inference, which makes them less helpful in new environments where the user does not know the building's layout.

Project Tango “sees how we see”

This Google smartphone/tablet platform uses a Point Cloud Library to map a space from the (x, y, z) coordinates of each point within the device's range of vision. It has applications in virtual reality, area learning, and indoor wayfinding.
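As a rough sketch of what that data looks like in code: the depth callback delivers each frame as a packed buffer of floats, three per point in Tango's Java API. The helper below (our illustrative names, not the platform's) unpacks such a buffer into (x, y, z) triples, assuming Tango's depth-frame convention of x to the right, y downward, and z forward (depth).

```java
// Sketch: unpack (x, y, z) triples from a Tango-style point cloud buffer.
// Assumes three packed floats per point, as in TangoXyzIjData.xyz;
// the class and method names here are illustrative.
import java.nio.FloatBuffer;

final class PointCloudReader {
    /** Copies the buffer into an array of [x, y, z] points. */
    static float[][] toPoints(FloatBuffer xyz, int pointCount) {
        float[][] points = new float[pointCount][3];
        xyz.rewind();                  // read from the first coordinate
        for (int i = 0; i < pointCount; i++) {
            points[i][0] = xyz.get();  // x: meters to the right of the sensor
            points[i][1] = xyz.get();  // y: meters below the sensor
            points[i][2] = xyz.get();  // z: meters in front of the sensor (depth)
        }
        return points;
    }
}
```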

Visualizing Unfamiliar Environments

Over the course of 20 hours, we created an Android app built on Project Tango's Point Cloud Library that uses depth patterns to differentiate between obstacles, walls and corners, and ascending and descending staircases. We designed the app with the visually impaired in mind and used text-to-speech to give the user auditory directions for navigating new buildings.
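For the auditory side, Android provides a TextToSpeech API. The wrapper below is a minimal sketch of how a single command word could be spoken; the DirectionSpeaker class and its wiring are illustrative, not the project's actual code.

```java
// Sketch: speaking a navigation command with Android's TextToSpeech API.
// Assumes API level 21+ for the four-argument speak(); names are illustrative.
import android.content.Context;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

final class DirectionSpeaker implements TextToSpeech.OnInitListener {
    private final TextToSpeech tts;
    private boolean ready = false;

    DirectionSpeaker(Context context) {
        tts = new TextToSpeech(context, this);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
            ready = true;
        }
    }

    /** Speaks a command such as "wall" or "upstairs". */
    void speakCommand(String command) {
        if (ready) {
            // QUEUE_FLUSH drops queued speech so stale directions are not read out.
            tts.speak(command, TextToSpeech.QUEUE_FLUSH, null, "nav-cmd");
        }
    }
}
```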

Implementation

Using the Point Cloud Library, we created a digital representation of the immediate physical space around the user. We then extracted coordinates from the Point Cloud Buffer and clustered the depth readings into sectors. By analyzing the relative and absolute depths of these sectors, we established threshold values and algorithms that differentiate between types of obstacles and determine which command to give (left, right, wall, obstacle, upstairs, downstairs, okay). Voice commands were implemented with Android's TextToSpeech API.
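The sector analysis could look something like the sketch below: points are binned into a 3×3 grid by their (x, y) position, a mean depth is computed per sector, and simple threshold rules map the resulting pattern to a command. The grid size, threshold constants, and rules here are illustrative assumptions, not the tuned values described above.

```java
// Sketch of the sector analysis: bin [x, y, z] points into a 3x3 grid of
// sectors, average the depth (z) per sector, and apply threshold rules.
// COLS/ROWS, WALL_M, and OBSTACLE_M are assumed placeholder values.
final class SectorClassifier {
    private static final int COLS = 3, ROWS = 3;
    private static final float WALL_M = 1.0f;      // assumed: whole view nearer than 1 m
    private static final float OBSTACLE_M = 1.2f;  // assumed: center blocked within 1.2 m

    /** Mean depth of each sector; points holds [x, y, z] triples in meters. */
    static float[][] sectorDepths(float[][] points, float halfWidth, float halfHeight) {
        float[][] sum = new float[ROWS][COLS];
        int[][] count = new int[ROWS][COLS];
        for (float[] p : points) {
            // Map x to a column and y to a row, clamping outliers to the edges.
            int c = clamp((int) ((p[0] + halfWidth) / (2 * halfWidth) * COLS), COLS - 1);
            int r = clamp((int) ((p[1] + halfHeight) / (2 * halfHeight) * ROWS), ROWS - 1);
            sum[r][c] += p[2];
            count[r][c]++;
        }
        float[][] mean = new float[ROWS][COLS];
        for (int r = 0; r < ROWS; r++)
            for (int c = 0; c < COLS; c++)
                mean[r][c] = count[r][c] > 0 ? sum[r][c] / count[r][c] : Float.MAX_VALUE;
        return mean;
    }

    /** Maps sector depths to a spoken command. */
    static String classify(float[][] d) {
        float left = d[1][0], center = d[1][1], right = d[1][2];
        if (left < WALL_M && center < WALL_M && right < WALL_M) return "wall";
        if (center < OBSTACLE_M) {
            // Center is blocked: steer toward whichever side reads deeper (clearer).
            return left > right ? "left" : "right";
        }
        return "okay";
    }

    private static int clamp(int v, int max) {
        return Math.max(0, Math.min(max, v));
    }
}
```

A caller would run each depth frame through sectorDepths, pass the result to classify, and hand the returned word to a text-to-speech wrapper like the one sketched earlier. Stair detection needs the "slope" test discussed under Challenges, sketched in the next section.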

Challenges

Extracting depth values from the Point Cloud Buffer, setting accurate threshold values for different objects, and implementing a "slope" threshold that sets staircases apart from walls and simpler obstacles.
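One way to realize such a slope threshold, under the same assumptions as the earlier sketches (y pointing down, z as depth), is to fit a least-squares line of depth against vertical position over the points in the central column: a wall gives a near-zero slope, while a staircase gives a consistent nonzero slope whose sign separates ascending from descending. The MIN_SLOPE constant below is an assumed placeholder, not the project's tuned threshold.

```java
// Sketch of a "slope" test for staircases: fit depth = a*y + b by least
// squares and threshold the slope a. Constants and names are illustrative.
final class StairDetector {
    private static final float MIN_SLOPE = 0.4f;  // assumed: below this, treat as wall/obstacle

    /** Least-squares slope of depth (z) against vertical position (y). */
    static float depthSlope(float[][] points) {
        float sy = 0, sz = 0, syy = 0, syz = 0;
        int n = points.length;
        for (float[] p : points) {
            sy += p[1];
            sz += p[2];
            syy += p[1] * p[1];
            syz += p[1] * p[2];
        }
        float denom = n * syy - sy * sy;
        return denom == 0 ? 0 : (n * syz - sy * sz) / denom;
    }

    static String classify(float[][] centerColumnPoints) {
        float slope = depthSlope(centerColumnPoints);
        if (Math.abs(slope) < MIN_SLOPE) return "wall-or-obstacle";
        // With y pointing down, depth growing as y decreases (negative slope)
        // means the scene rises ahead of the user: ascending stairs.
        return slope < 0 ? "upstairs" : "downstairs";
    }
}
```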

Accomplishments

Using our application, a blindfolded person successfully navigated staircases and obstacles (see video). In the future, this app could let the visually impaired walk through buildings and receive real-time, predictive auditory aid without having to rely on other people, animals, or tools.

What's next

- Port the app to a smaller device, such as an Android smartphone, to minimize hassle.
- Refine communication from device to user to make it more subtle, possibly using Bluetooth bands that vibrate to signal the user's next steps.
- Implement location-based services that give room-to-room directions within buildings the user has never visited, using a combination of satellite GPS and provided floor plans.
- Use area learning to store and refine routes over time when the user revisits the same place.
- Address challenges such as sunlight interference and crowded spaces.
- Design a practical physical housing that the user could wear hands-free.
