Inspiration

Recent natural disasters around the globe inspired us to pursue this project. We thought an aerial, non-ground-bound approach would be especially helpful for rescuing people. There are ongoing drone rescue projects, but we wanted a more immersive interface through an AR lens.

What it does

The AR lens monitors the drone. Instead of pressing buttons on the stock controller, the user steers the drone with hand motion while holding a Vive controller. The AR lens shows a holographic display of the Vive controllers in full screen, with the drone's camera view in the corner.
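
Below is a minimal sketch of that control mapping, assuming the Vive controller is exposed as a tracked Unity Transform (as the SteamVR plugin provides). The DroneClient type, axis conventions, and dead-zone threshold are illustrative assumptions, not the exact hackathon code; a DroneClient sketch follows in the next section.

```csharp
using UnityEngine;

// Maps the tilt of a tracked Vive controller to drone pitch/roll commands.
// Assumes `controller` is a Transform that follows the Vive controller
// (e.g. via the SteamVR plugin) and `drone` is a hypothetical client that
// accepts normalized [-1, 1] command values.
public class HandMotionDroneControl : MonoBehaviour
{
    public Transform controller;     // tracked Vive controller
    public DroneClient drone;        // hypothetical drone command sender
    public float maxTiltDegrees = 30f;

    void Update()
    {
        Vector3 euler = controller.rotation.eulerAngles;

        // Convert 0..360 Euler angles to signed -180..180 degrees.
        float pitchDeg = Mathf.DeltaAngle(0f, euler.x);
        float rollDeg  = Mathf.DeltaAngle(0f, euler.z);

        // Normalize tilt into [-1, 1] and clamp.
        float pitch = Mathf.Clamp(pitchDeg / maxTiltDegrees, -1f, 1f);
        float roll  = Mathf.Clamp(-rollDeg / maxTiltDegrees, -1f, 1f);

        // Small dead zone so the drone hovers when the hand is level.
        if (Mathf.Abs(pitch) < 0.1f) pitch = 0f;
        if (Mathf.Abs(roll)  < 0.1f) roll  = 0f;

        drone.Move(roll, pitch, gaz: 0f, yaw: 0f);
    }
}
```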

How we built it

We used the Parrot AR.Drone 2.0, with the HTC Vive as the control system. In Unity, we developed an interface that connects to the drone and lets the user control it with body motion while monitoring its position through the Vive's front-facing camera.
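
Under the hood, the AR.Drone 2.0 is driven by ASCII AT commands sent over UDP to port 5556, with each float argument encoded as the integer that shares its 32-bit pattern. A minimal sketch of such a sender, assuming direct UDP control rather than the SDK's own wrappers (the DroneClient name and structure are ours):

```csharp
using System;
using System.Net.Sockets;
using System.Text;

// Minimal AR.Drone 2.0 command client. AT commands are ASCII strings sent
// over UDP to port 5556; each carries an increasing sequence number.
public class DroneClient
{
    readonly UdpClient udp = new UdpClient();
    int seq = 1;

    public DroneClient(string droneIp = "192.168.1.1")
    {
        udp.Connect(droneIp, 5556);
    }

    void Send(string command)
    {
        byte[] data = Encoding.ASCII.GetBytes(command);
        udp.Send(data, data.Length);
    }

    // Reinterpret a float's bits as a 32-bit int, as the protocol expects.
    static int FloatToIntBits(float f)
    {
        return BitConverter.ToInt32(BitConverter.GetBytes(f), 0);
    }

    public void TakeOff() { Send($"AT*REF={seq++},290718208\r"); }
    public void Land()    { Send($"AT*REF={seq++},290717696\r"); }

    // Progressive movement: all values normalized to [-1, 1].
    public void Move(float roll, float pitch, float gaz, float yaw)
    {
        Send($"AT*PCMD={seq++},1,{FloatToIntBits(roll)}," +
             $"{FloatToIntBits(pitch)},{FloatToIntBits(gaz)},{FloatToIntBits(yaw)}\r");
    }
}
```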

Challenges we ran into

Working with the drone meant dealing with tracking and stability issues. The drone tended to fly off-kilter without natural light, which slowed our development. The Parrot AR.Drone SDK 2, however, provided a fairly smooth developer experience.

For tracking, we originally intended to place a Vive Tracker on top of the drone. However, the limited size of the tracking space made this infeasible.

Retrieving drone data and controlling the drone through a completely new interface was challenging. Tracking hand motion and translating it into drone commands was a critical part of the project, as sketched below.
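
On the retrieval side, the drone streams navdata (state bits, attitude, battery, and so on) over UDP port 5554 once the client sends a small trigger datagram. A rough sketch of that handshake, as we understand the protocol; the class name and the omitted error handling are simplifying assumptions:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

// Minimal AR.Drone 2.0 navdata reader. The drone starts streaming status
// packets on UDP port 5554 after receiving a short trigger datagram.
public class NavdataReader
{
    readonly UdpClient udp;
    IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);

    public NavdataReader(string droneIp = "192.168.1.1")
    {
        udp = new UdpClient(5554);          // bind local port 5554
        udp.Connect(droneIp, 5554);
        udp.Send(new byte[] { 0x01, 0x00, 0x00, 0x00 }, 4); // wake the stream
    }

    public void ReadOnce()
    {
        byte[] packet = udp.Receive(ref remote);

        // Every navdata packet starts with the magic header 0x55667788,
        // followed by a 32-bit drone state bitfield and a sequence number.
        uint header = BitConverter.ToUInt32(packet, 0);
        uint state  = BitConverter.ToUInt32(packet, 4);
        uint seq    = BitConverter.ToUInt32(packet, 8);

        if (header == 0x55667788)
            Console.WriteLine($"navdata #{seq}, state bits: {state:X8}");
    }
}
```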

Accomplishments that we're proud of

This project is meaningful because it can be applied to real-world problems: the system could be used to search for and interact with survivors of natural disasters.

What we learned

We learned to harness augmented reality for real-life problems. AR is becoming increasingly promising, and as we saw during development, AR applications can genuinely serve humanitarian causes. Controlling the drone with hand gestures is more intuitive for the user than manipulating physical controls. We also learned that the drone's flight stability depends on lighting: it flies stably in daylight but drifts at night.

What's next for Game of Drones

We want to improve both the AR control software and the drone hardware. We would like to use computer vision algorithms to map terrain for automated drone navigation, and to build a model from the drone's view so the project extends to land analysis. Additional features such as voice commands could also be added. For more reliable, interactive performance, we could install and control microphones and speakers on the drones.

Built With

Parrot AR.Drone 2.0 · HTC Vive · Unity
