Inspiration
We think it would be really cool if people could control drones with their own hands (or bodies), free of any form of controller.
What it does
Cockpit allows users to control a drone (Parrot AR 2.0) using their hands (the right wrist in particular). The drone tracks and follows the user's hand position in real time.
Demo
Dependency
- python3, node.js
- tensorflow 1.4.1+
- opencv
- argparse
- numpy, scipy, matplotlib
- git+https://github.com/ppwwyyxx/tensorpack.git
- npm install ar-drone
How We built it
We utilized tf-pose-estimation (https://github.com/ildoonet/tf-pose-estimation), a deep pose estimation application implemented in TensorFlow that supports real-time human pose estimation through a webcam (or, in our case, the drone camera), to extract and recognize human motion and pose. To feed the raw video stream to tf-pose-estimation, we used ar-drone (https://github.com/felixge/node-ar-drone), a node.js client for controlling Parrot AR Drone 2.0 quadcopters, to record the video and stream it to the backend for analysis. After the backend analyzes the pose, detects pose changes, and makes a decision, that decision is sent back to ar-drone, also in real time, so the drone can execute the corresponding motion command. This is what lets users control the drone with their bodies.
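The hand-off of decisions from the Python backend to the node.js ar-drone client can be sketched as below. The wire format here (newline-delimited JSON objects over a TCP socket) and the helper names are illustrative assumptions, not the project's actual protocol:

```python
import json
import socket

# Hypothetical wire format: each decision is one newline-delimited JSON
# object, e.g. {"cmd": "left", "speed": 0.3}. The node.js side would
# listen on a local port, parse each line, and call the matching
# ar-drone client method (drone.left(speed), drone.right(speed), ...).

def encode_decision(cmd, speed):
    """Serialize one movement decision for the node.js client."""
    return (json.dumps({"cmd": cmd, "speed": round(speed, 3)}) + "\n").encode()

def send_decision(sock, cmd, speed):
    """Push one decision over an already-connected TCP socket."""
    sock.sendall(encode_decision(cmd, speed))
```

On the node side, a `net.createServer` handler that buffers incoming data and splits on newlines is enough to recover each JSON decision.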
Challenges We ran into
- Sending the real-time video stream from the drone camera back to the backend so that we could analyze it with OpenCV
- Establishing communication between node.js and Python so that we could send decisions to the drone in real time
- Coming up with the formulas that map the user's right wrist coordinates to the direction in which we want the drone to move
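The wrist-to-direction mapping can be sketched as a pure function over normalized image coordinates. The dead-zone threshold, the direction names, and the y-axis handling below are illustrative assumptions, not the actual formulas we tuned:

```python
def wrist_to_command(x, y, dead_zone=0.15):
    """Map a normalized right-wrist position (x, y in [0, 1], origin at
    the top-left of the frame) to a drone movement command.

    Inside a central dead zone the drone hovers; outside it, the larger
    horizontal/vertical offset from the frame center picks the direction,
    and the offset magnitude scales the speed.
    """
    dx = x - 0.5          # positive: wrist is right of center
    dy = y - 0.5          # positive: wrist is below center
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return ("hover", 0.0)
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
        magnitude = abs(dx)
    else:
        direction = "back" if dy > 0 else "front"
        magnitude = abs(dy)
    # The offset is at most 0.5, so doubling it yields a speed in (0, 1].
    return (direction, min(1.0, magnitude * 2))
```

The dead zone matters in practice: without it, pose-estimation jitter near the frame center would keep issuing tiny contradictory commands and the drone would never hold still.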
Accomplishments that we are proud of
We successfully overcame every challenge we ran into.
What we learned
- How to use some awesome computer vision and drone-related packages.
- How to connect the drone to the backend.
- How to connect node.js and Python and establish communication between them.
- The importance of the relative motion between the drone and the user.
What's next for Cockpit
Next, we want to let users control the drone along the vertical axis as well. To improve the user experience, the drone's speed could also be adjusted according to the user's body motion. We would also like the drone to rotate, or perform other fancy maneuvers, in response to the user's pose.
