This code is for a MakeHarvard project that turns your backpack into a third eye, increasing your awareness of your surroundings and your safety. This is achieved with an Xbox Kinect sensor and a notification bracelet that work together to enhance your senses in both darkness and light. The bracelet buzzes you with gentle vibrating motors when the Kinect detects people in the frame.
There are a few steps required to enable this code to run:
1. Install the freenect package from https://github.com/OpenKinect/libfreenect (instructions are detailed in its README). Be sure to install the Python 3 wrappers, which allow the Python code in this repository to interface with the Kinect.
2. Install the latest version of OpenCV.
3. Clone https://github.com/NVIDIA-AI-IOT/tf_trt_models into the root directory of this repository, then follow the instructions in that repository's README to install the necessary dependencies and datasets. You can choose which model TensorFlow will use; the default is set by the variable MODEL in testing_kinect_with_tf.py as 'ssd_mobilenet_v1_coco'.
4. Move testing_kinect_with_tf.py into the tf_trt_models folder.
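As a rough illustration of the detection step, here is a minimal sketch of how a people count might be derived from an SSD-style detector's output. The function name `count_people` and the 0.5 score threshold are assumptions for illustration, not taken from testing_kinect_with_tf.py; in the COCO label map used by 'ssd_mobilenet_v1_coco', class ID 1 is "person".

```python
# Hypothetical helper: count "person" detections in one frame.
# SSD-style detectors return parallel sequences of class IDs and
# confidence scores; in the COCO label map, class ID 1 is "person".

PERSON_CLASS_ID = 1  # COCO label map: 1 == "person"

def count_people(classes, scores, threshold=0.5):
    """Return the number of detections classified as a person with a
    confidence at or above the threshold (threshold is an assumption)."""
    return sum(
        1
        for cls, score in zip(classes, scores)
        if cls == PERSON_CLASS_ID and score >= threshold
    )

# Example with made-up detector output: two confident people,
# one car (class 3), and one low-confidence person.
classes = [1, 1, 3, 1]
scores = [0.9, 0.8, 0.7, 0.3]
print(count_people(classes, scores))  # 2
```

The resulting int is what gets pushed to the server in the step below.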
Once an Adafruit server is set up and configured in testing_kinect_with_tf.py, run that file with Python 3. If successful, the program should push the number of people in the frame to the server as an int.
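The publishing step might look like the sketch below, assuming the official Adafruit_IO Python client (`pip install adafruit-io`). The feed name "people-count", the placeholder credentials, the `get_people_count` stub, and the change-only publishing policy are all assumptions for illustration, not details from this repository.

```python
import time

def should_publish(previous, current):
    """Only push when the count changes, to limit traffic to the
    Adafruit IO feed (this de-duplication policy is an assumption)."""
    return previous is None or current != previous

def get_people_count():
    """Placeholder for the Kinect + TensorFlow detection step."""
    return 0

def main():
    # Imported here so the rest of the module works without the library.
    from Adafruit_IO import Client

    aio = Client("YOUR_AIO_USERNAME", "YOUR_AIO_KEY")  # placeholder credentials
    previous = None
    while True:
        people = get_people_count()
        if should_publish(previous, people):
            aio.send("people-count", people)  # push the int to the feed
            previous = people
        time.sleep(1)

# Call main() to start publishing.
```

Publishing only on change keeps the program comfortably within Adafruit IO's per-minute rate limits.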
The sketch code is still at the prototype stage.