Having always been inspired by the field of autonomous robotics, we decided to build our own robot this hackathon. Last hackathon, we tried to take apart a small drone, but unfortunately the drone got fried, and it was really hard to interface with in the first place. This hackathon, we pre-ordered the parts for the drone the week before and brought them all to the hackathon to build. We decided to go the IMU-interfacing route (our glove) to control a drone.
What it does
We have an IMU that is connected to and read by Unity over serial. This lets us do two things. First, we can run our filtering in Unity instead of on the Arduino, so latency for data collection is minimal and the computer handles the calculations. Second, we can create a modular system for VR/AR applications that interfaces with sensor hardware, both to prove the concept and to leave room to scale in later projects. We then take the rotational data from the IMU and let the raspi on the drone handle the calculations needed to turn and match the IMU's orientation.
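Our actual filtering runs in Unity (C#), but the idea can be sketched in Python: parse a line of serial data into accelerometer and gyro readings, then blend the two with a complementary filter. The line format and the filter coefficient here are assumptions for illustration, not our exact firmware output.

```python
import math

# Hypothetical serial line format: "ax,ay,az,gx,gy,gz"
# (accelerometer in g, gyroscope in deg/s) -- the real format
# depends on the Arduino sketch.
def parse_imu_line(line):
    ax, ay, az, gx, gy, gz = (float(v) for v in line.strip().split(","))
    return (ax, ay, az), (gx, gy, gz)

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend fast-but-drifting gyro integration with noisy-but-stable
    accelerometer tilt; alpha weights the gyro path."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def update_roll(prev_roll, accel, gyro, dt):
    """One filter step for the roll axis, estimating tilt from accel."""
    ax, ay, az = accel
    gx, _, _ = gyro
    accel_roll = math.degrees(math.atan2(ay, az))
    return complementary_filter(prev_roll, gx, accel_roll, dt)
```

In the real setup, each line read from the serial port would feed one `update_roll` call per frame, with `dt` taken from the frame time.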
How we built it
Our team split into two groups: builders and coder. For the past 24 hours, the builders (Benjamin Nguyen & Banyan Nguyen) scoured SLO for spare parts we could use for our drone frame, and proceeded to create a robust, sturdy frame capable of surviving the many crashes that would be thrown at it. The coder (Jeremy Li) focused on the code needed to interface with the raspi, along with the general motor-control code that is essential to a functioning drone.
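The motor-control side can be sketched as a throttle-to-pulse-width mapping: hobby ESCs are driven like servos, where a pulse of roughly 1000 µs means "off" and 2000 µs means full throttle. The pulse range below is the common hobby-ESC convention and the pigpio call in the comment is one typical way to emit it on a Pi; both are assumptions rather than our exact code.

```python
# Standard hobby-ESC pulse range (microseconds); some ESCs differ.
MIN_PULSE_US = 1000
MAX_PULSE_US = 2000

def throttle_to_pulse(throttle):
    """Clamp throttle to [0, 1] and map it linearly to an ESC pulse width."""
    throttle = max(0.0, min(1.0, throttle))
    return int(MIN_PULSE_US + throttle * (MAX_PULSE_US - MIN_PULSE_US))

# On the raspi itself, this pulse would be sent to a GPIO pin with a
# library such as pigpio, e.g.:
#   pi = pigpio.pi()
#   pi.set_servo_pulsewidth(MOTOR_PIN, throttle_to_pulse(0.3))
# (MOTOR_PIN is a placeholder -- the actual pin depends on the wiring.)
```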
Challenges we ran into
One of the hardest challenges the builders faced was a "first-time" problem. With little experience in the field of drones, they scoured the web for tutorials and guides to piece their drone together. Their problems were compounded by the fact that they spent 3 hours CADing up a model, but none of the 3D printers at Cal Poly could print it in time or were accessible. They then headed to the Simpson Construction Building to recycle some scrap wood by putting it to use in the frame. Eventually, at 6 am, they had a functioning drone that could receive motor commands and was powerful enough to lift off. For the coder, networking was the biggest problem, and still is. Being only high schoolers, we have limited exposure to this kind of knowledge, and it was definitely a learning experience. Getting the raspi to talk to our computer was a major problem, and it remains a bug in our code.
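Since the laptop-to-raspi link is still buggy, here is only a minimal sketch of the kind of thing we're aiming for: the computer sends rotation commands to the Pi as small UDP datagrams (UDP keeps latency low and tolerates the occasional dropped packet). The JSON message format is made up for illustration.

```python
import json
import socket

def send_command(sock, addr, roll, pitch, yaw):
    """Send one rotation command (degrees) as a JSON datagram."""
    msg = json.dumps({"roll": roll, "pitch": pitch, "yaw": yaw})
    sock.sendto(msg.encode(), addr)

def receive_command(sock):
    """Block until one command datagram arrives and decode it."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode())
```

On the drone, a loop around `receive_command` would feed the latest roll/pitch/yaw targets into the motor-mixing math.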
Accomplishments that we're proud of
We are definitely proud of using really unorthodox methods and materials to materialize a drone in the past 24 hours. Much time was spent trying to obtain power tools and heavy equipment we could use to create our frame. Luckily, the drone was completed on time, and we are super proud that it works and can lift off.
What we learned
We learned a lot, including how to install drone motors and props and how ESCs function. It was fun and rewarding to use a power supply to directly control our motors, and to use a raspi as our flight controller instead of a regular one. This creates a more IoT-friendly network, so we can interface with it more easily than we could with other, more specialized FPV drones.
What's next for IMU interface for Drone Control, Motion tracking glove
We're definitely going to continue this. After getting a drone to interface with the IMU, we will try to build a robotic arm that does the same. After that, we will move on to a more complete version of the glove, where sensors on the hand can articulate finer, more precise movements.