What is it?
Using the Intel RealSense, we first extracted a user's hand movements and mapped certain gestures to specific actions for the drone. We then transmitted the data to an Arduino over Bluetooth Low Energy. The Arduinos used digital potentiometers to generate signals readable by the drone's transmitter chip. Since we had never used FPV before, we also included a GoPro with a 5.8 GHz transmitter to stream a live first-person view to a wireless monitor.
We have always been interested in drones and their increased prevalence in our culture. We love reverse engineering commercial products, and we wanted to experiment with the Intel RealSense.
How we built it
To extract gestures, we used the RealSense SDK in C++ to read the coordinates of points on a user's hand. This let us match certain combinations of coordinates to different instructions for the drone. The camera tracks where the user's hand is and moves the drone accordingly. More complex custom instructions, such as landing the drone and hovering, can be called by using two hands. To fly the drone, we completely disassembled its stock transmitter and used digital potentiometers to digitally and autonomously emulate its joysticks. Our homemade transmitter is controlled by four Arduino Unos; this was necessary because the digital potentiometers are driven over SPI, and a single Arduino Uno could only manage a limited number of them.
Challenges I ran into
The most difficult part of this project was emulating the drone's control system. Due to the tight specifications of the controller, it was very difficult to create signals that fit within its tolerance boundaries, and it took many hours of calibration to figure out exactly which signals were accepted. After that, we had to map our controls onto those signals. Another challenge was streaming the gesture data from the Intel RealSense to the master Arduino over a SparkFun Bluetooth Low Energy board.