Kinect Vision Aid analyzes RGB and depth data from a worn Kinect sensor to help guide blind people through everyday life. Written in C++ and in Processing (Java), it offers three applications that describe the surrounding environment through audio notifications. The first component continuously analyzes the live depth data and alerts the wearer when an object comes within a set range in front of them, so they do not bump into anything (a sketch of this check follows below). The second component captures the RGB data and analyzes it with AlchemyAPI to identify the objects in front of the wearer, helping describe the world around them. The third component uses the Kinect's body-tracking feature to announce the positions of other people nearby: the wearer is told whether a person is to the left, to the right, or directly in front of the sensor, and how far away they are in feet (see the second sketch below).
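The depth check can be illustrated with a short sketch. The code below is a minimal, hypothetical version in plain Java (the project itself runs in Processing): it scans a central window of a raw depth frame, in millimeters as a Kinect-style sensor typically delivers, and reports whether anything falls inside a warning threshold. The frame layout, threshold, and helper names are illustrative assumptions, not the project's actual code.

```java
/**
 * Minimal sketch of the obstacle-proximity check (illustrative, not the
 * project's actual code). Assumes a 640x480 depth frame of millimeter
 * values, with 0 meaning "no reading", as a Kinect v1 typically reports.
 */
public class ProximityCheck {

    static final int WIDTH = 640;
    static final int HEIGHT = 480;
    static final int THRESHOLD_MM = 900;   // warn inside ~3 ft (assumed value)

    /** Returns true if any valid pixel in the central window is closer than the threshold. */
    static boolean obstacleAhead(int[] depthMm) {
        // Only look at the middle third of the frame: roughly the path
        // directly in front of the wearer.
        for (int y = HEIGHT / 3; y < 2 * HEIGHT / 3; y++) {
            for (int x = WIDTH / 3; x < 2 * WIDTH / 3; x++) {
                int d = depthMm[y * WIDTH + x];
                if (d > 0 && d < THRESHOLD_MM) {
                    return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Synthetic frame: everything 2 m away except one patch at 0.5 m.
        int[] frame = new int[WIDTH * HEIGHT];
        java.util.Arrays.fill(frame, 2000);
        frame[(HEIGHT / 2) * WIDTH + WIDTH / 2] = 500;

        if (obstacleAhead(frame)) {
            System.out.println("Obstacle ahead -- trigger audio warning");
        }
    }
}
```

In the live system, a check like this would run on every depth frame and, when it flips to true, queue an audio notification rather than print to the console.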

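For the body-tracking notifications, the essence is mapping a tracked person's position in sensor space to a spoken direction and a distance in feet. The sketch below assumes the Kinect's skeleton tracker has already provided a joint position as (x, z) coordinates in meters, with x measured left-to-right from the sensor's optical axis; the sector boundary and names are assumptions made for illustration, not the project's actual code.

```java
/**
 * Illustrative sketch of the person-position announcement (not the
 * project's actual code). Assumes the body tracker yields a joint
 * position in meters: x left/right of the sensor axis, z straight out.
 */
public class PersonAnnouncer {

    static final double METERS_PER_FOOT = 0.3048;
    static final double CENTER_BAND_M = 0.4;  // assumed half-width of the "in front" sector

    /** Builds the spoken message for one tracked person. */
    static String describe(double xMeters, double zMeters) {
        String direction;
        if (xMeters < -CENTER_BAND_M) {
            direction = "to your left";
        } else if (xMeters > CENTER_BAND_M) {
            direction = "to your right";
        } else {
            direction = "directly in front of you";
        }
        long feet = Math.round(zMeters / METERS_PER_FOOT);
        return "Person " + direction + ", about " + feet + " feet away";
    }

    public static void main(String[] args) {
        // A person 0.8 m to the right and 2.1 m out: about 7 feet away.
        System.out.println(describe(0.8, 2.1));
        // A person on the sensor axis, 1.2 m out: about 4 feet away.
        System.out.println(describe(0.0, 1.2));
    }
}
```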