Inspiration

Our high school orchestra director's family in the Bahamas was impacted by Hurricane Dorian. This inspired us, as software engineers, to build a project that would help emergency services search efficiently for victims of a natural disaster.

What it does

Flare is a React Native application that works with drone technology to analyze the victim density of an image quadrant from a bird's-eye view using deep learning. The drone takes a picture of a quadrant and sends the image to a Python Flask server, where it is run through our deep learning model to classify the victim density level. The density level is pinned to the drone's location and can be viewed in the Flare client through the Google Maps API integration.
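The server side of that pipeline can be sketched as a single Flask endpoint. The route name, form fields, and density labels below are assumptions, and the CNN call is stubbed out so the sketch stays self-contained:

```python
# Minimal sketch of the Flask endpoint that receives a quadrant photo
# plus the drone's GPS fix and returns a victim density classification.
from flask import Flask, request, jsonify

app = Flask(__name__)
DENSITY_LABELS = ["low", "medium", "high"]  # assumed class labels

def classify_density(image_bytes: bytes) -> str:
    """Stand-in for the CNN: returns a density label for the image."""
    # The real server would preprocess image_bytes and run the
    # TensorFlow model; a fixed label keeps this sketch runnable.
    return DENSITY_LABELS[0]

@app.route("/classify", methods=["POST"])
def classify():
    # The drone posts the quadrant image and its coordinates;
    # the client later pins the result on the Google Map.
    image = request.files["image"].read()
    lat = float(request.form["lat"])
    lng = float(request.form["lng"])
    return jsonify({"density": classify_density(image),
                    "lat": lat, "lng": lng})
```

The Flare client would then poll or receive these pinned results and render them as map markers.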

How we built it

We built the client side of Flare with React Native (JavaScript) and the server with Python Flask. Our deep learning model is a convolutional neural network built with TensorFlow; it reached a validation accuracy of approximately 72%.
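A small CNN classifier in the spirit of that model might look like the following. The input size (128×128 RGB), layer sizes, and the three density classes are assumptions, not the actual architecture:

```python
# Sketch of a small TensorFlow/Keras convolutional classifier for
# victim density levels (assumed: 3 classes, 128x128 RGB input).
import tensorflow as tf

def build_model(num_classes: int = 3) -> tf.keras.Model:
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(128, 128, 3)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Training on labeled aerial imagery would then report the validation accuracy quoted above.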

Challenges we ran into

The biggest challenge for Flare was avoiding both over-fitting and under-fitting our deep learning model, so that its predictions would generalize to unseen aerial images.
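One common way to keep a Keras model from over-fitting is early stopping on the validation loss; the hyperparameters below are illustrative, not the values we used:

```python
# Early stopping: halt training once validation loss stops improving
# and roll back to the best weights seen so far.
import tensorflow as tf

early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",          # watch validation loss, not training loss
    patience=5,                  # tolerate 5 stagnant epochs before stopping
    restore_best_weights=True,   # keep the best checkpoint, not the last one
)

# Passed to training via (data names are placeholders):
# model.fit(x_train, y_train, validation_split=0.2,
#           epochs=100, callbacks=[early_stop])
```

Dropout layers or data augmentation address the same problem from the model side.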

Accomplishments that we're proud of

We learned a lot about properly fitting a deep learning model.

What we learned

We learned how to create a cross-platform mobile application using React Native.

What's next for Flare

Next steps for Flare are to gather a larger data set, build more victim identification models, and enhance the UI/UX design by following Material Design guidelines.

Built With

flask, google-maps, javascript, python, react-native, tensorflow
