One device controls the camera, and one device controls the motion of the drone.
The controller is designed so that the up and down buttons can be used without looking.
View from the drone.
Imagine that, while driving, your eyes were fixed facing forward, so that to look around you had to change your direction of motion. This is what it is normally like to fly a fixed-camera drone.
Normally, a user directly controls the pitch, yaw, and roll of a drone, and all motion is relative to the direction of the camera.
In our app, the user simply turns the viewing device in the direction they want to look, and the drone faces that direction. The user points a separate controller in the direction they want the drone to move, and the drone moves in that direction, even if it is completely different from the direction the drone is facing!
How it works:
Pitch and Roll: To make the drone move in whatever direction the user is pointing, we came up with the following formula for pitch and roll:
Theta = ControllerAzimuth - DroneHeading
AbsoluteForce = tan(AbsoluteTilt)
Pitch = atan(AbsoluteForce * cos(Theta))
Roll = atan(AbsoluteForce * sin(Theta))
Yaw: There is no built-in command to make the drone face a certain direction, so we automatically adjust the rate of yaw until the drone is facing the same direction as the controller. This required slowing the yaw as the drone approaches the correct heading in order to prevent oscillation.
Calibration: We use three calibration values for the azimuth of the controller, viewing device, and drone. Calibration is user friendly: the user simply points all three devices in the same direction and presses “calibrate.”
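One way to realize this (a sketch under our assumptions; the app's actual bookkeeping may differ) is to capture, at the moment the user presses "calibrate," how far each device's azimuth reading deviates from a shared reference, and subtract that offset from all later readings:

```python
import math

def calibrate(controller_az, viewer_az, drone_az):
    """Capture offsets while all three devices point the same direction."""
    # Use the controller's frame as the shared reference and record how far
    # the other two devices' raw readings deviate from it right now.
    return {
        "viewer": viewer_az - controller_az,
        "drone": drone_az - controller_az,
    }

def to_reference(raw_azimuth, offset):
    """Map a later raw reading into the shared frame, wrapped to (-pi, pi]."""
    return math.atan2(math.sin(raw_azimuth - offset),
                      math.cos(raw_azimuth - offset))
```

After calibration, readings from all three devices can be compared directly, even though each device's magnetometer has its own bias.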
What we used:
- ARDroneSDK3 is used to control the drone.
- The Android Sensor Orientation Library is used to integrate gyroscope, magnetometer, and accelerometer data.
- Data is sent from the controller to the viewing device over Bluetooth sockets.
- Lots of trigonometry, and lots of spinning around in circles staring at numbers on our phones.
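The controller-to-viewer link only needs a few floats per update. A sketch of one possible wire format (this framing is our illustration, not the app's actual protocol): three little-endian 32-bit floats per packet, which either end of a Bluetooth socket can read or write:

```python
import struct

# Hypothetical wire format: azimuth, pitch, roll as little-endian float32.
PACKET = struct.Struct("<fff")

def encode_orientation(azimuth, pitch, roll):
    """Pack one orientation update into a fixed-size byte string."""
    return PACKET.pack(azimuth, pitch, roll)

def decode_orientation(data):
    """Unpack a received packet back into (azimuth, pitch, roll)."""
    return PACKET.unpack(data)
```

A fixed-size packet keeps framing trivial: the receiver reads exactly 12 bytes per update, with no delimiter parsing.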
What we learned:
Drones are awesome. Working with three-dimensional orientation data in three different frames of reference is hard. The Android Sensor Orientation Library is very powerful.
What’s next for RubberNeck:
Use the viewing device in a VR headset, and integrate alternate pointing devices such as a Myo Armband.