DEMO: https://noahs-ark.dev/

PowerPoint: Click Here!

Inspiration

Two of our team members battled through Hurricane Irma as it ravaged the Florida coast in 2017. Another suffered through a week of power loss after Hurricane Sandy tore through the New England coast. Hurricanes and the floods that follow them have always been part of our team’s lives. In 2020, Category 4 Hurricane Laura alone caused nearly $9 billion in damage. We also believe these storms will become more and more common as climate change raises water levels and water temperatures, making more cities, and thus more people, susceptible to flooding. This means insurance companies will have to start offering flood insurance in areas where they have no prior hurricane or flood data. We realized that a tool which provides resilience for residents while also serving as a better risk instrument for insurers could help improve outcomes from hurricanes and floods. So, we built Noah’s Ark.

What it does

Noah’s Ark is a tool for homeowners and insurance providers that uses deep convolutional autoencoders on hurricane flood imagery to estimate the flooding risk of an area. Presented as an overlay on Google Maps, Noah’s Ark lets users see exactly which areas, streets, canals, and rivers are more prone to flooding. On top of the image data, we use elevation data in the final flood-risk determination and also provide real-time data on storm systems over the U.S. Users can freely explore the world by dragging the map around and searching for places.

How we built it

We used a variety of frameworks and technologies to build Noah’s Ark. At the core of the project is Google Cloud, specifically the Google Maps JavaScript API. On top of the basic maps interface the JavaScript API offers, we layered weather overlays from Iowa State University's Open Mesonet (geospatial data), elevation data, and geocoding (which converts a user-entered address into latitude and longitude coordinates).
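For illustration, here is a rough sketch of that address-to-coordinates step, calling the Geocoding API's HTTP endpoint from Python (the Maps JavaScript API exposes the same service in the browser). The environment-variable name and helper function are placeholders, not our exact code:

```python
# Sketch of the geocoding step: user-entered address -> (lat, lng).
import os
import requests

GOOGLE_MAPS_KEY = os.environ["GOOGLE_MAPS_KEY"]  # placeholder API key

def geocode(address: str) -> tuple[float, float]:
    """Convert an address string into latitude/longitude coordinates."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": address, "key": GOOGLE_MAPS_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    best_match = resp.json()["results"][0]  # take the top result
    loc = best_match["geometry"]["location"]
    return loc["lat"], loc["lng"]

# e.g. geocode("Houston, TX") returns roughly (29.76, -95.37)
```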

These latitude and longitude coordinates are then used to collect Google Maps satellite imagery of those locations, which is fed into a deep convolutional autoencoder trained to predict what the flooded area would look like. The datasets we used to train the model come from NOAA’s satellite imagery of hurricanes such as Harvey and Irma.
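As a hedged illustration of this step (not the exact code we ran), the sketch below pulls a satellite tile for a coordinate pair via the Static Maps HTTP endpoint and normalizes it for the model. The key, zoom level, and 128x128 input size are assumptions:

```python
# Illustrative sketch: fetch a satellite tile for (lat, lng) and prepare it
# as a model input. Zoom level and IMG_SIZE are assumed values.
import io
import requests
import numpy as np
from PIL import Image

GOOGLE_MAPS_KEY = "..."  # placeholder
IMG_SIZE = 128           # assumed model input size

def fetch_satellite_tile(lat: float, lng: float, zoom: int = 17) -> np.ndarray:
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/staticmap",
        params={
            "center": f"{lat},{lng}",
            "zoom": zoom,
            "size": "640x640",
            "maptype": "satellite",
            "key": GOOGLE_MAPS_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    img = Image.open(io.BytesIO(resp.content)).convert("RGB")
    img = img.resize((IMG_SIZE, IMG_SIZE))
    return np.asarray(img, dtype="float32") / 255.0  # scale pixels to [0, 1]
```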

The autoencoder used to predict the flooded images is divided into two parts, an encoder and a decoder. The encoder consists of two convolutional and max-pooling layers followed by a fully connected layer; the decoder is the inverse of that, beginning with a fully connected layer and ending with two convolutional and upsampling layers. This mirrored structure ensures that the input and output have the same shape while still allowing transformations within the image (a minimal sketch of the architecture follows below). We modeled the autoencoder in Keras/TensorFlow and trained it on the NOAA satellite imagery. Once we receive the model’s output, we overlay the image on the corresponding coordinates of the Google map, where darker pixels indicate places that could flood. Combined with elevation data, this drives a relatively simple risk score ranging from no risk to danger.

The web application was hosted on AWS and served by a Python Flask server. The front end was built with the Materialize CSS library and uses vanilla JavaScript along with a bit of jQuery.
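Here is a minimal Keras/TensorFlow sketch of that architecture. The filter counts, bottleneck size, and 128x128 RGB input are assumptions rather than the model's actual hyperparameters, and the decoder mirrors the encoder's pooling with upsampling layers:

```python
# Minimal sketch of the convolutional autoencoder described above.
# Layer sizes are assumed, not the trained model's exact configuration.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = 128     # assumed input resolution
LATENT_DIM = 256   # assumed bottleneck size

def build_autoencoder() -> tf.keras.Model:
    inputs = layers.Input(shape=(IMG_SIZE, IMG_SIZE, 3))

    # Encoder: two conv + max-pooling blocks, then a fully connected layer.
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Flatten()(x)
    latent = layers.Dense(LATENT_DIM, activation="relu")(x)

    # Decoder: fully connected layer, then two conv + upsampling blocks,
    # so the output has the same shape as the input image.
    x = layers.Dense((IMG_SIZE // 4) * (IMG_SIZE // 4) * 64, activation="relu")(latent)
    x = layers.Reshape((IMG_SIZE // 4, IMG_SIZE // 4, 64))(x)
    x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.UpSampling2D(2)(x)
    outputs = layers.Conv2D(3, 3, activation="sigmoid", padding="same")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# Hypothetical usage: fit on (input tile, flooded target) pairs from the
# NOAA imagery -- the exact pairing scheme is assumed here.
# autoencoder = build_autoencoder()
# autoencoder.fit(input_tiles, flooded_tiles, epochs=..., batch_size=...)
```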

Challenges we ran into

Training an effective autoencoder in a limited time was quite difficult, especially when the data we had was tens of gigabytes of images. We weren’t able to use all of it for training, but could definitely improve the model with more time and data.

Accomplishments that we’re proud of

We’re proud of lining up NOAA satellite data with data in Google Maps, building a working autoencoder for multi-colored images, and using the ML model to create a Google Maps layer that we can place on the map.

What we learned

We had never used autoencoders in practice and it was eye-opening to learn how they worked and how to build them. We also learned a lot about how Google Maps layers work and how we can layer several at a time.

What's next for Noah's Ark

Improve the models. This could mean training the autoencoder on more data, or designing a more efficient neural network through further research. Expand the capabilities of the neural network. One idea we’d love to implement in the future is helping insurance companies predict insurance rates based on the flooding risk of a house or business.
