[Gallery captions: gathering data for training the model · NVIDIA TX2 / J120 · plot map of a loaded magazine cross section · Intel Aero all wired up]
We were asked to build something to help the people in Ferguson. So we did.
What it does
It stops bullets. It sees guns and bombs through walls. It sees people through walls, and under mud, snow, and debris. It tracks active shooters and actively, autonomously gets in the way.
How we built it
Our fundamental building ethos is delivering an unbiased-by-design lifesaving utility. We're developing the indiscriminate EDNA drone specifically for deployment by uniformed public safety officers: law enforcement, firefighters, and school security.
The EDNA was built by integrating several next-generation technologies: augmented reality (AR), artificial intelligence (AI), machine learning (ML), sensor fusion, biometric feedback, and a holographic user interface, creating a _neuromechanical_ drone tool. We've designed it with a "platform-agnostic" ideal in mind: the EDNA system can be integrated into a number of high-capacity commercial UAV platforms. We currently prefer to build with the Intel Aero platform, which supports standard electronics expansions. We replaced the Intel RealSense R200 sensor with an Intel RealSense D415, a high-resolution depth camera that computes 3D depth maps to optimize computer vision (CV). Then we added an *Intel Movidius VPU* (vision processing unit), which accelerates the execution of DNNs (deep neural networks) and CV at very low power consumption and very high speed.
For improved image processing and better deep-learning performance, we've incorporated the NVIDIA Redtail project's TrailNet DNN technology, which is also environmentally aware, detecting objects and avoiding obstacles. Built this way, the EDNA performs autonomous collision avoidance rather than merely automated collision avoidance: the autonomous system "thinks" and takes whatever spontaneous actions are necessary to reach a desired state.
We built the EDNA drone's indiscriminate capabilities on a system of anomaly detection: presence of a real gun (yes/no), loaded or unloaded (yes/no), presence of human life (yes/no), etc. (we call it "Predictive Probable Cause"). Through sensor fusion, anomaly detection is facilitated via an intricate, cooperative integration of numerous partner technologies, delivering functionality that combines three key elements:
- Radio-frequency three-dimensional sensing
- Long-range computer vision
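The yes/no chain behind "Predictive Probable Cause" can be sketched as a simple decision combinator. The flag names, thresholds, and combination rule below are our own illustrative assumptions, not the shipped logic:

```python
# Hypothetical sketch of the "Predictive Probable Cause" decision chain:
# each sensor answer is a yes/no flag, and the combined result gates the
# drone's response. All names and rules here are illustrative.
from dataclasses import dataclass

@dataclass
class AnomalyReading:
    gun_detected: bool   # RF + CV agree a real firearm is present
    gun_loaded: bool     # RF signature consistent with a loaded magazine
    human_present: bool  # depth camera detects human life nearby

def predictive_probable_cause(reading: AnomalyReading) -> bool:
    """True when the yes/no chain would justify intervention."""
    return reading.gun_detected and reading.human_present

def threat_level(reading: AnomalyReading) -> str:
    """Grade the anomaly once probable cause is established."""
    if not predictive_probable_cause(reading):
        return "none"
    return "high" if reading.gun_loaded else "elevated"
```

Keeping each sensor's contribution a separate boolean is what makes the chain auditable: any single "no" can be traced back to the stream that produced it.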
In order to detect AR-15s and 9mm firearms (the most common handguns used in the U.S.), our software builds up a neural network by adding classifications. We began testing with .45 rounds, then .40 and rifle rounds, then 9mm and .22. The RF data algorithm is trained on the presence of a metallic firearm component (a 9mm handgun magazine or the 9mm handgun itself). After training a neural network composed of two fully connected layers (each with a sigmoid activation function), we achieved a first prediction accuracy of 79%, better than we expected, and one we will continue to improve through ongoing development and testing.
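A two-fully-connected-layer sigmoid network of the kind described can be sketched in a few lines of NumPy. The toy data, layer sizes, and training schedule below are our stand-ins for illustration, not the actual RF training set or production model:

```python
# Minimal sketch: two fully connected layers, each with a sigmoid
# activation, trained as a binary presence/absence detector.
# Toy Gaussian data stands in for the real RF scan features.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

# Class 0: background; class 1: firearm present (higher mean response).
X = np.vstack([rng.normal(0.0, 1.0, (100, 8)),
               rng.normal(2.0, 1.0, (100, 8))])
y = np.concatenate([np.zeros(100), np.ones(100)]).reshape(-1, 1)

W1 = rng.normal(0.0, 0.5, (8, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(500):
    h = sigmoid(X @ W1 + b1)            # first FC layer, sigmoid
    p = sigmoid(h @ W2 + b2)            # second FC layer, sigmoid
    grad_out = (p - y) / len(X)         # cross-entropy gradient at output
    grad_h = (grad_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5
accuracy = float((pred == (y > 0.5)).mean())
```

On the easily separable toy data this small network trains to high accuracy; real RF scans are far noisier, which is why the first real-world result landed at 79%.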
The initial training was deliberately limited to a dataset of a .45 semi-automatic handgun magazine scanned at a distance of roughly 4-6 cm from the sensor. The sensor field was restricted to 5° vertically and horizontally with a depth of 1 to 10 centimeters, image scans being performed every 2 degrees at a resolution of 1 mm. This was done to control the quantity of ambient data sources inevitably being scanned by the sensor, so that the majority of the data would be representative. A second dataset was produced, and the algorithm is now able to correctly identify likely targets.
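The scan geometry above implies a small, fixed sampling grid. The reconstruction below is our interpretation of those parameters (sign conventions and inclusive bounds are assumptions):

```python
# Hypothetical reconstruction of the scan grid described above: a 5-degree
# vertical x 5-degree horizontal field sampled every 2 degrees, with range
# gated to 1-10 cm at 1 mm resolution. Conventions are assumptions.
def scan_grid(fov_deg=5, step_deg=2, r_min_mm=10, r_max_mm=100, r_step_mm=1):
    """Enumerate (azimuth, elevation) scan positions and range bins."""
    angles = list(range(-fov_deg, fov_deg + 1, step_deg))
    positions = [(az, el) for az in angles for el in angles]
    range_bins = list(range(r_min_mm, r_max_mm + 1, r_step_mm))
    return positions, range_bins

positions, range_bins = scan_grid()
# 6 x 6 = 36 angular positions, 91 one-millimetre range bins each
```

Enumerating the grid this way makes clear how small the deliberately restricted dataset was: a few thousand range samples per object pass.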
We combined devices in order to:
- determine object characteristics (such as shape and proximity to a human being)
- reflect electromagnetic waves off an object positioned behind a structure
- create a planar map and a three-dimensional perspective for predictive trajectory tracking
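The predictive trajectory tracking element can be illustrated with the simplest possible motion model: extrapolate the last two position fixes under a constant-velocity assumption. A real system would fuse the RF and depth streams through a proper filter (e.g. a Kalman filter); this function and its names are purely illustrative:

```python
# Illustrative sketch of predictive trajectory tracking: linear
# extrapolation from the two most recent position fixes of a tracked
# object, assuming constant velocity over the prediction horizon.
def predict_position(p_prev, p_curr, dt, horizon):
    """Predict where a tracked point will be `horizon` seconds ahead.

    p_prev, p_curr: previous and current (x, y, z) fixes
    dt: seconds elapsed between the two fixes
    """
    velocity = [(c - p) / dt for p, c in zip(p_prev, p_curr)]
    return [c + v * horizon for c, v in zip(p_curr, velocity)]

# Object moved from (0, 0, 0) to (1, 2, 0) in 0.5 s;
# predicted position another 1.0 s ahead:
predicted = predict_position([0, 0, 0], [1, 2, 0], dt=0.5, horizon=1.0)
```

Even this naive model is enough to decide where the drone must fly to "get in the way"; the fusion layers only improve the quality of the fixes it consumes.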
The interpretations of the RF sensor combine with the multiple sensor streams produced by the Intel RealSense. We began with the relatively standard approach of feeding the streams into an existing Convolutional Neural Network (CNN) selected for performance, accuracy, and efficiency (YOLO, from Darknet). Then we transfer-trained it as necessary.
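The transfer-training step can be illustrated without the full YOLO stack: freeze a pretrained feature extractor and fit only a new classification head for the firearm classes. Below, a fixed random projection stands in for the pretrained CNN backbone, and all data and dimensions are illustrative assumptions:

```python
# Framework-free illustration of transfer training: the "backbone" is
# frozen (here, a fixed random ReLU projection standing in for a
# pretrained CNN) and only a new logistic-regression head is fitted.
import numpy as np

rng = np.random.default_rng(1)
W_backbone = rng.normal(size=(32, 12))        # frozen pretrained weights

def features(x):
    return np.maximum(0.0, x @ W_backbone)    # frozen ReLU feature map

def head_prob(F, w, b):
    return 1.0 / (1.0 + np.exp(-np.clip(F @ w + b, -30, 30)))

# Toy labelled samples for the new classes (no gun / gun).
X = np.vstack([rng.normal(0.0, 1.0, (80, 32)),
               rng.normal(1.0, 1.0, (80, 32))])
y = np.concatenate([np.zeros(80), np.ones(80)])

F = features(X)                               # computed once; backbone frozen
w = np.zeros(12); b = 0.0
for _ in range(300):
    g = head_prob(F, w, b) - y                # cross-entropy gradient
    w -= 0.1 * (F.T @ g) / len(X)
    b -= 0.1 * g.mean()

accuracy = float(((head_prob(F, w, b) > 0.5) == (y > 0.5)).mean())
```

Because only the small head is trained, this converges with far less labelled data than training a detector from scratch, which is exactly why transfer training suited our limited firearm dataset.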
Situational awareness to the degree provided by the EDNA has the potential to be game-changing for law enforcement and emergency management. In order to be a game-changer for everyone else, the EDNA should be able to both act upon its hyper-specific conclusions and actively intervene to forcibly de-escalate potential gun violence. In order to give the EDNA UAV a bulletproof body, different materials need to be combined into a single armor.
We've designed armoring from materials which meet the U.S. National Institute of Justice's requirements: materials that are highly resistant, lightweight, and durable. We're currently researching aluminum metallic foam (AMF) and composite Kevlar and polyethylene fibers (both used in bulletproof vests) to manufacture a "sandwich structure" in which the core material, or matrix, is the AMF and the back and front plates are made of Kevlar (KV, poly-paraphenylene terephthalamide). An additional outstanding armoring option for the EDNA is stainless-steel composite metal foam. We tested and demonstrated that the raw, untreated drone armor dissipates most of the incoming bullet force (even when the armor technically fails, it simply deforms; it does not splinter or shatter). Continued testing has shown that a KV or aluminum backplate fabricated with the composite metal foam, combined with a front layer of B4C ceramic, yields an armor that is bullet- and blast-resistant, further eliminating bullet lethality.
Currently the drone is designed to be semi-autonomous, meaning it can be navigated manually via the standard RC controller, and its autonomous behavior can be overridden by a human operator via that controller. By default, it autonomously responds to certain specific criteria: when it detects an anomaly, it shifts into "guardian angel mode"; "active shooter mode" is triggered when a hand (any hand) gets too close to a gun. This provides an intuitive piloting system, one that allows precise operation with an ultra-fast learning curve and minimal operator training.
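The mode logic above amounts to a small state machine. The mode names come from the text; the trigger priorities and the override-wins rule are our assumptions for illustration:

```python
# Sketch of the autonomous-mode logic described above. Mode names are
# from the text; trigger ordering and the override rule are assumptions.
class EdnaModes:
    def __init__(self):
        self.mode = "patrol"

    def update(self, anomaly_detected=False, hand_near_gun=False,
               operator_override=False):
        if operator_override:        # human RC input always supersedes autonomy
            self.mode = "manual"
        elif hand_near_gun:          # any hand too close to a gun
            self.mode = "active_shooter"
        elif anomaly_detected:       # firearm or other anomaly present
            self.mode = "guardian_angel"
        return self.mode
```

Checking the operator override first encodes the semi-autonomous design: autonomy fills in by default, but a human can always take back the controls.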
We now use a Brother AirScouter, which provides a much less invasive AR "drone's-eye view": a real-time or near-real-time display of sensor output showing situational and environmental conditions from a safe distance. Dangerous tasks in potentially hostile or inaccessible environments can therefore be performed with no human risk.
Challenges we ran into
The original design worked with a Parrot ARDrone and was primarily focused on developing the advanced piloting system, an endeavor which was successful but eventually shelved due to user feedback. Among the challenges presented by the ARDrone was packet loss: data sent correctly but failing to arrive at its destination. In the case of the ARDrone, this is a known issue caused by insufficient hardware capabilities; the AR console does not manage its extremely limited resources particularly well. Due to these hardware limitations, we ended up using an intermediary computer in lieu of a more robust companion computer.
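Packet loss of this kind is easy to quantify on the receiving side if each telemetry packet carries a sequence number. The sketch below is a generic technique, not the ARDrone protocol:

```python
# Illustrative sketch of measuring packet loss from sequence-number gaps
# on the receiver. Assumes monotonically increasing sequence numbers;
# this is a generic technique, not the ARDrone's actual protocol.
def packet_loss_rate(received_seq_numbers):
    """Fraction of packets lost, inferred from gaps in sequence numbers."""
    if not received_seq_numbers:
        return 0.0
    expected = received_seq_numbers[-1] - received_seq_numbers[0] + 1
    return 1.0 - len(received_seq_numbers) / expected

# Packets 4, 7, 8, and 9 never arrived: 6 of 10 expected packets made it.
loss = packet_loss_rate([1, 2, 3, 5, 6, 10])
```

Instrumenting the link this way is what let us attribute the dropped frames to the console's resource limits rather than to our own software.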
Eventually, after evaluating numerous drone platforms, we chose to simply use the Intel Aero, which was designed to be friendly for active development.
We originally developed a sophisticated piloting mechanism involving an Emotiv Insight EEG sensor and a Microsoft HoloLens, mapped together via the HoloLens to override the drone's navigational controls and manual piloting interface (the hand-held RC controller). While this technology proved relatively straightforward to implement, feedback from early focus groups suggested it was simply too advanced for the majority of users. As such, it was tabled from the overall EDNA project, and it remains tabled.
Accomplishments that we're proud of
As evidence of our unbiased-by-design lifesaving EDNA's popularity in the court of public opinion, we've been endorsed by the American College of Emergency Physicians, Black Lives Matter, and the Fraternal Order of Police.
We had a paper accepted by IEEE and will be presenting at the Global Humanitarian Technology Conference in October in San Jose. We'll also be paneling at Harvard's Intercollegiate Business Conference in Boston, as well as at BIFrance in Paris, also in October.
In March 2018 we won the Audience Choice Award at the Women Startup Challenge: Emergent Tech, hosted by Women Who Tech at Google HQ in New York City.
What we learned
Nobody else has integrated our ideas for combining technologies on an unmanned aerial vehicle, so we filed a patent covering nine inventions.
The hardware components are expensive to develop with and require ongoing fundraising for R&D. In deployment, however, the EDNA is actually very cost-effective equipment for cities across the country, as the many agency requests for this tool attest.
Though we've documented solid computational-science R&D and filed a patent, many people just want to see the EDNA in action.
What's next for EDNA - Astral AR
We're currently working toward our first release, launching between Q1 and Q3 2019 to 16 public safety agencies and 19 schools throughout the central Texas region.