Abstract

Our goal was to create a more user-friendly experience for flying a drone. Instead of the conventional remote control, our aircraft receives instructions from the user's hand movements.

The Idea

The idea for this project came about from the discovery of a training port on the back of the drone's remote control. This port allows two transmitters to be connected together for training purposes. After analyzing the transmitters' signals, we were able to generate an identical signal that simulates a trainee's transmitter.
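
The writeup does not spell out the trainer-port protocol, so the sketch below assumes a conventional PPM trainer stream (8 channels, 300 µs separator pulses, 1000-2000 µs channel values, 22.5 ms frames). The pin, polarity, and timing constants are illustrative assumptions, not the project's actual values.

    #define F_CPU 16000000UL          /* assumed clock; adjust to the board */
    #include <avr/io.h>
    #include <util/delay.h>

    #define PPM_PIN  PB5              /* assumed trainer-port output pin */
    #define NUM_CH   8
    #define SEP_US   300              /* separator pulse width (us) */
    #define FRAME_US 22500            /* total frame length (us) */

    /* _delay_us() wants a compile-time constant, so loop 1 us at a time. */
    static void delay_us_var(uint16_t us)
    {
        while (us--)
            _delay_us(1);
    }

    /* Send one PPM frame; ch[] holds channel pulse widths of 1000-2000 us. */
    void ppm_send_frame(const uint16_t ch[NUM_CH])
    {
        uint16_t used = 0;

        DDRB |= (1 << PPM_PIN);                /* pin as output */
        for (uint8_t i = 0; i < NUM_CH; i++) {
            PORTB &= ~(1 << PPM_PIN);          /* low separator pulse */
            delay_us_var(SEP_US);
            PORTB |= (1 << PPM_PIN);           /* high time encodes the value */
            delay_us_var(ch[i] - SEP_US);
            used += ch[i];
        }
        PORTB &= ~(1 << PPM_PIN);              /* final separator, then sync gap */
        delay_us_var(SEP_US);
        PORTB |= (1 << PPM_PIN);
        delay_us_var(FRAME_US - used - SEP_US);
    }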

Implementation

We constructed a pair of gloves fitted with a BNO055 absolute orientation sensor. The orientation data is read over the I2C bus by our ATmega32U4 microcontroller, which processes it, generates the appropriate control signal, and sends it through the transmitter to the quadcopter. This arrangement allows a user to fly the quadcopter using only their hands. All the code for this project was written in AVR-C; debugging was done over USART.
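
As an illustration of the I2C step, here is a rough sketch of reading the BNO055's Euler heading from the ATmega32U4. The register address and 1/16-degree scaling come from the BNO055 datasheet, but the i2c_* helpers are assumed placeholders for the project's own low-level TWI routines (avr-libc provides no high-level I2C API), and the sensor is assumed to already be configured in a fusion mode such as NDOF.

    #include <stdint.h>

    #define BNO055_ADDR        0x28   /* default 7-bit I2C address */
    #define BNO055_EUL_HEADING 0x1A   /* Euler heading LSB register */

    /* Assumed low-level TWI helpers (not shown); addr_rw is the 7-bit
     * address shifted left with the R/W bit in bit 0. */
    void    i2c_start(uint8_t addr_rw);
    void    i2c_write(uint8_t data);
    uint8_t i2c_read_ack(void);
    uint8_t i2c_read_nack(void);
    void    i2c_stop(void);

    /* Returns heading in 1/16-degree units (BNO055 Euler output format). */
    int16_t bno055_read_heading(void)
    {
        uint8_t lsb, msb;

        i2c_start(BNO055_ADDR << 1);          /* write: set register pointer */
        i2c_write(BNO055_EUL_HEADING);
        i2c_start((BNO055_ADDR << 1) | 1);    /* repeated start: read two bytes */
        lsb = i2c_read_ack();
        msb = i2c_read_nack();
        i2c_stop();

        return (int16_t)(((uint16_t)msb << 8) | lsb);
    }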

Challenges

The first struggle our team ran into was not allowing the sensor enough time to process its data. The second difficulty was our exponential control function: computing it at run time would have taken too long, so we built a lookup table of precomputed values. That array, however, was too large for our 2.5 kB of RAM, so we had to store the table in program memory and access it at run time.
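
A minimal sketch of that program-memory workaround, assuming a 16-bit table (the real table's size and contents are not given): the PROGMEM attribute places the array in flash instead of SRAM, and pgm_read_word() copies one entry out at run time.

    #include <avr/pgmspace.h>
    #include <stdint.h>

    #define CURVE_SIZE 256

    /* 256 x 2 bytes lives in flash, leaving the 2.5 kB of SRAM untouched. */
    static const uint16_t expo_curve[CURVE_SIZE] PROGMEM = {
        0, 3, 7, 12, /* ... remaining precomputed values ... */
    };

    /* Fetch one tabulated value from program memory at run time. */
    uint16_t expo_lookup(uint8_t index)
    {
        return pgm_read_word(&expo_curve[index]);
    }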

Features

  • Interface enables the user to interact with the quadcopter more directly, making flying feel more natural.
  • Platform-independent interface - The interface we constructed is capable of controlling any radio-controlled device (plane, helicopter, quadcopter, car, boat).
  • Fully customizable mapping function model - Our final version of the code uses an exponential model for the controls. This mapping offers greater stability and flight control by desensitizing small movements near the center (origin); one common form of such a curve is sketched after this list.
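
The exact curve we used is not reproduced here, but one common exponential mapping of this kind looks like the floating-point sketch below (the constant k is an assumption). In the project, such values were precomputed into the flash lookup table described under Challenges rather than evaluated on the AVR at run time.

    #include <math.h>

    /* x in [-1, 1] maps to [-1, 1]; a larger k flattens the curve near the
     * center, desensitizing small hand movements around the origin. */
    double expo_map(double x, double k)
    {
        double sign = (x < 0.0) ? -1.0 : 1.0;
        return sign * (exp(k * fabs(x)) - 1.0) / (exp(k) - 1.0);
    }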

Built With

  • avr-c
  • c