Inspiration

What it does

TrackIt! is a universal tripod add-on that keeps a specified target in view of the camera. Any object can be selected, from humans to animals to even soccer balls, enabling hands-free recording. Gone are the days when people must view the world through a digital lens while trying to follow a person, place, or thing.

How we built it

TrackIt! is divided into two parts: Object Tracking and Servo Control.

Object Tracking

Object tracking is done with computer vision and deep learning tracking algorithms that can accurately and efficiently follow a user-selected region of the frame. After determining where the object sits relative to the center of the frame, instructions are sent to the Flask server, which delivers them to the Servo Control units.
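
A minimal sketch of that tracking-to-server flow, using a classical OpenCV tracker as a stand-in for the deep learning tracker and a hypothetical `/move` endpoint on the Flask server (the actual model and endpoint names may differ):

```python
import cv2
import requests

SERVER_URL = "http://<server-ip>:5000/move"  # hypothetical Flask endpoint

cap = cv2.VideoCapture(0)
ok, frame = cap.read()

# Let the user draw a box around the target to track
bbox = cv2.selectROI("Select target", frame, showCrosshair=False)
tracker = cv2.TrackerCSRT_create()  # cv2.legacy.TrackerCSRT_create() on newer OpenCV builds
tracker.init(frame, bbox)

frame_h, frame_w = frame.shape[:2]

while True:
    ok, frame = cap.read()
    if not ok:
        break

    found, (x, y, w, h) = tracker.update(frame)
    if not found:
        continue

    # Offset of the box center from the frame center, normalized to [-1, 1]
    dx = ((x + w / 2) - frame_w / 2) / (frame_w / 2)
    dy = ((y + h / 2) - frame_h / 2) / (frame_h / 2)

    # Hand the offsets to the Flask server; each Pi picks up its own axis
    requests.post(SERVER_URL, json={"dx": dx, "dy": dy}, timeout=1)
```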

Servo Control

Servo control is handled by two Raspberry Pis. Using the built-in GPIO, a single Pi can effectively drive only one servo at a time, which would have limited the scope of the project. The solution was to use two Pis, one responsible for vertical movement and the other for horizontal. Each Pi reads instructions from a Python Flask server to determine where to move.
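
A rough sketch of what one Pi's control loop could look like, assuming software PWM through RPi.GPIO and a hypothetical `/offset` endpoint on the Flask server that returns the latest offsets; the pin number, step size, and endpoint are illustrative assumptions, not the exact code we ran:

```python
import time
import requests
import RPi.GPIO as GPIO

SERVER_URL = "http://<server-ip>:5000/offset"  # hypothetical endpoint serving the latest offsets
SERVO_PIN = 18   # GPIO pin driving this Pi's servo (assumed)
AXIS = "dx"      # this Pi handles horizontal; the other Pi reads "dy"

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)

# Standard hobby servos expect a 50 Hz PWM signal;
# roughly 2.5% duty cycle = 0 degrees, 12.5% = 180 degrees
pwm = GPIO.PWM(SERVO_PIN, 50)
pwm.start(7.5)   # start centered
angle = 90.0

try:
    while True:
        offset = requests.get(SERVER_URL, timeout=1).json().get(AXIS, 0.0)

        # Nudge the servo toward the target; the offset is normalized to [-1, 1]
        angle = max(0.0, min(180.0, angle - offset * 5.0))
        pwm.ChangeDutyCycle(2.5 + angle / 18.0)
        time.sleep(0.05)
finally:
    pwm.stop()
    GPIO.cleanup()
```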

Challenges we ran into

The original workflow plan did not work because the WiFi in the area was simply too slow to handle real-time video streaming, an essential part of the project. The original idea was to outsource the heavy lifting of image recognition and object tracking to a powerful GPU in the cloud, but that plan was scrapped because video could not be uploaded fast enough on the network. Instead, all of the work was done locally, which hurts performance, but it still performs far better than if we had stuck with the online model.

We realized pretty quickly that trying to drive two servos from one Pi, powered directly off the GPIO pins, was a bad idea. Both servos constantly locked up, and true diagonal movement was impossible; we tried to fake it by alternating small movements between the two servos, but that did not yield a satisfactory result. Many ideas were considered, including using a breadboard with an external power supply, and even using an Arduino as a power source. Eventually we settled on two Pis, because of the appeal of easily running the motors in sync without having to alternate currents and frequencies the way a single Pi would have had to. The end result works pretty well: TrackIt! can go up, down, left, right, and diagonally.

Accomplishments that we're proud of

We are not hardware people, and all of our knowledge is in software, so it was very rewarding to stumble through the wonderful world of hardware and figure out what each piece does. This was also the first time we used a modular design, with multiple components being built in parallel and eventually being put together and magically working. It was amazing to see TrackIt! succeed, given how ambitious it was.

What we learned

We gained a lot of knowledge about live video streaming, even if it didn't end up helping this project. It is always useful to have more tools in the toolkit for next time. Additionally, we gained a basic understanding of breadboards, the GPIO, and how a Pi can be used to interface with physical devices.

What's next for TrackIt!

Faster tracking and camera movement can easily be achieved with a better GPU handling the load. Right now, approximately 2 or 3 frames are processed per second, but even a mid-range consumer GPU could push that number up to nearly 100 frames per second. Additionally, we plan to take the project to the cloud, where we are no longer restricted by local hardware and the small quirks of the prototype can be ironed out.
