Drones offer immense potential for integration with 5G technology. Every year there are more and more tools for developers to leverage the power of drones to create novel solutions to tackle challenging problems. When we learned about the 5G Edge Computing Challenge from AWS and Verizon, we immediately knew that we had to participate.

We entered the challenge with the intention of creating a powerful interface for leveraging computer vision AI models in real-time on drone video feed through 5G. We decided to focus on the high-need use case of emergency services, and got to work. It’s important to consider that while our solution is designed for this use case, it has additional applications far beyond just emergency services.

Inefficiencies in the Emergency Service Response

Whenever someone calls 911, it’s critical that the response from emergency services is as fast as possible; in many cases, even a few minutes can be the difference between life and death. However, in many of today’s largest cities, heavy traffic can contribute to delays as long as 15 minutes. Additional delays arise in scenarios such as fires: firefighters can only evaluate a fire and come up with an effective plan to put it out once they physically arrive on scene and see it.

Noticing these inefficiencies, we decided to develop an application that leverages the power of 5G and drone technology to offer a promising solution to this problem.

Application Features and Use Cases

1) Emergency Services

With the LifeHawk mobile application, emergency services can not only deploy drones to emergency scenes (either with a pilot or through a built-in autopilot system), but also leverage the power of customized, state-of-the-art AI computer vision models running on the edge in real time.

Here is a use case scenario. Police can deploy a drone to a scene immediately after receiving a 911 call. While driving to the scene, officers can view AI-processed footage from the drone in real time. Suppose someone at the scene is carrying a gun; the object detection model running on the edge can detect this. Police now know there is an armed suspect before they arrive on scene, and can evaluate the situation and strategize en route since they can see it from the drone. This saves time, improves efficiency, and saves lives.
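As a minimal sketch of what the edge-side alerting step could look like (the detection tuple format, the `THREAT_CLASSES` label set, and the threshold are illustrative assumptions, not our actual codebase), the inference service can filter each frame’s raw detections by confidence and flag threat classes:

```python
# Illustrative sketch: flag threat-class detections in one frame's output.
# The (label, confidence, bbox) convention and label names are assumptions;
# the real model's classes and output format may differ.

THREAT_CLASSES = {"gun", "knife"}  # hypothetical label set
CONF_THRESHOLD = 0.5

def flag_threats(detections, threshold=CONF_THRESHOLD):
    """Return detections of threat classes above the confidence threshold.

    Each detection is a (label, confidence, bbox) tuple, where bbox is
    (x, y, w, h) in pixels.
    """
    return [d for d in detections
            if d[0] in THREAT_CLASSES and d[1] >= threshold]

# Example: raw detections from one frame of drone footage
frame_detections = [
    ("person", 0.97, (120, 80, 60, 160)),
    ("gun",    0.81, (150, 140, 25, 15)),
    ("person", 0.33, (300, 90, 55, 150)),
]

alerts = flag_threats(frame_detections)
# alerts == [("gun", 0.81, (150, 140, 25, 15))]
```

Only the flagged detections would then need to be pushed to responders as alerts, keeping the notification channel lightweight even when the model emits many boxes per frame.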

2) Commercial Activities

Of course, LifeHawk’s functionality isn’t limited to emergency services. Anyone can upload their own custom deep learning models to a G4 EC2 instance and take full advantage of the powerful features our application offers without making any changes to the application code.

For example, cell tower companies can upload their own deep learning model that detects structural anomalies in a cell tower during maintenance.

Testing the LifeHawk Application

The LifeHawk mobile application must be paired with a supported DJI drone before use. Luckily, our application supports a wide variety of drones, ranging from the budget DJI Spark and Mavic Mini to the commercial-grade DJI Matrice. This makes our solution an accessible and versatile option that can adapt to user preferences and needs.

After purchasing a prepaid Verizon SIM card, we tested our device over a 4G LTE connection in Las Vegas, NV, which happens to have an AWS Wavelength Zone. The drone we used for testing was the DJI Mavic Mini.

Software and Architecture Overview

The core of our solution is the LifeHawk mobile application. This application was developed in Java for Android with heavy usage of the DJI Mobile and UX SDKs. The application allows the user to control the drone, both manually and through autopilot features, check the drone’s status (GPS coordinates, battery level, etc.), adjust drone settings (flight speed, camera exposure, etc.), and receive an HD video stream from the drone upon successful connection.

Upon pressing the button labelled “G”, video from the drone is streamed via the RTMP protocol over Verizon’s 5G network to our AWS t3.medium EC2 instance, leveraging Wavelength for ultra-low latency. This EC2 instance runs Windows Server 2019 and Nginx, which allows it to function as an RTMP streaming server. The stream is then relayed to a G4 EC2 instance, which processes the incoming video feed with a custom-loaded AI model. The AI inference video feed can be accessed by emergency service workers (and by other users for other loaded models) via a Django web application hosted on the G4 instance.
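A minimal Nginx RTMP relay configuration along these lines (a sketch, not our exact config; the `live` application name, port, and the `<g4-instance-ip>` placeholder are illustrative) is enough to accept the drone’s stream and push it on to the inference instance:

```nginx
# Sketch of an nginx-rtmp-module config on the t3.medium relay instance.
rtmp {
    server {
        listen 1935;              # default RTMP port
        application live {
            live on;
            record off;
            # relay each incoming stream to the G4 inference instance
            push rtmp://<g4-instance-ip>/live;
        }
    }
}
```

The `push` directive re-publishes every incoming stream, so the G4 instance simply consumes `rtmp://<relay>/live/<stream-key>` as if the drone were streaming to it directly.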

Concluding Notes and Future Directions

Participating in this hackathon was a truly informative and entertaining experience. This was our first time using EC2 instances on AWS, and we learned a lot. We also had a ton of fun playing with our Java code to control our drone. We’re excited to see what kinds of problems we (and perhaps others) can solve with what we’ve laid out. Thank you for the opportunity!

