Originally from Haiti, we have lived through several natural disasters, including hurricanes and earthquakes, and understand their tragic impact. We have felt the despair of friends and family when their home is damaged by a natural disaster and they don't know where to start or whom to ask for help. We decided to participate in this hackathon to learn how we can better use AI and ML to solve a humanitarian challenge. Our experience inspired the question: how can we accurately and efficiently determine the extent of damage to individual homes in a disaster-impacted area?
Damage assessments during and after natural disasters are core activities for humanitarian organizations. During a disaster, they increase situational awareness and provide actionable information for rescue teams; afterwards, they are essential for evaluating the financial impact and the cost of recovery. However, carrying out damage assessments with field surveys is challenging for many organizations: it can take weeks or months and is logistically difficult when a rapid assessment is needed. In recent years, considerable research has explored using AI and machine learning to automate the classification of damage.
Inspired by this previous work, we decided to build a proof of concept.
What it does
To answer that question, our team proposed to train and deploy a proof-of-concept image classifier for damage assessment. We built a simple app that lets users upload an image of a building, road, or bridge and classify the extent of damage caused by an earthquake. The severity of damage in an image is the extent of physical destruction shown in it. For this experiment we consider only three levels of damage:
- severe damage
- mild damage
- no damage (none)
Our demo does not consider aerial or satellite images and does not include location or mapping capabilities. The deployed model was mainly trained on earthquake-related social media and Google images.
You can access and test our app here: disaster-damage-classifier on Hugging Face Spaces, deployed as a Gradio app.
This project is inspired by and built on top of existing work from the manuscripts “Damage Assessment from Social Media Imagery Data During Disasters” and “Automatic Image Filtering on Social Networks Using Deep Learning and Perceptual Hashing During Crises” by D. T. Nguyen et al.
How we built it
We trained the model on social media imagery using robust, recent pre-trained computer vision networks such as EfficientNet. The public data come from the manuscripts above and comprise images collected from Twitter during four natural disasters: Typhoon Ruby (2014), the Nepal Earthquake (2015), the Ecuador Earthquake (2016), and Hurricane Matthew (2016). In addition to Twitter images, the dataset contains images collected from Google using queries such as "damage building".
To improve our model we added data from the Haiti earthquakes of 2010 and 2021 and created a cross-event earthquake dataset, using the Nepal, Haiti, and Google data plus 30% of the Ecuador data for training, and the remaining 70% of the Ecuador Earthquake data for validation and testing.
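The cross-event split described above can be sketched in pure Python (the event names and the 30/70 ratio follow the description; the helper function itself is a hypothetical illustration, not our actual pipeline code):

```python
import random

def cross_event_split(images_by_event, held_out_event="ecuador",
                      held_out_train_frac=0.30, seed=42):
    """Cross-event split: all images from the other events go to training,
    plus a fraction of the held-out event; the rest of the held-out event
    is reserved for validation and testing."""
    rng = random.Random(seed)
    train = []
    for event, images in images_by_event.items():
        if event != held_out_event:
            train.extend(images)
    held_out = list(images_by_event[held_out_event])
    rng.shuffle(held_out)
    cut = int(len(held_out) * held_out_train_frac)
    train.extend(held_out[:cut])
    val_test = held_out[cut:]
    return train, val_test
```

Holding out most of one event this way gives a more honest estimate of how the model generalizes to an earthquake it has barely seen during training.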
We used Amazon SageMaker Studio and Studio Lab with TensorFlow to build and train the model.
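As a sketch, transfer learning with a pre-trained EfficientNet backbone in Keras looks roughly like this (the input size, dropout rate, and optimizer are illustrative assumptions; in practice the backbone is initialized from ImageNet weights via `weights="imagenet"`):

```python
import tensorflow as tf

NUM_CLASSES = 3  # severe, mild, none

def build_model(input_shape=(224, 224, 3), weights="imagenet"):
    """EfficientNetB0 backbone with a small classification head."""
    base = tf.keras.applications.EfficientNetB0(
        include_top=False, weights=weights, input_shape=input_shape)
    base.trainable = False  # freeze the backbone for the first training phase
    inputs = tf.keras.Input(shape=input_shape)
    x = base(inputs, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.2)(x)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Freezing the backbone first and training only the head is a common pattern when the dataset is small; a second fine-tuning phase with a low learning rate can then unfreeze some of the base layers.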
Challenges we ran into
The problem is complex and subjective: labeling this type of data is not straightforward. The data was also imbalanced, incomplete, and not of the best quality.
Accomplishments that we're proud of
We are proud to have created an app that can quickly classify an image by the extent of the damage it shows. We are also proud of our GitHub repo, our experiment report, and our Gradio app on Hugging Face Spaces.
What we learned
- Focus on improving your dataset from day 1
- Troubleshooting deep learning code
- Managing and organizing Python dependencies
- Tracking experiments with wandb.ai
What's next for disaster damage assessment from social media images
- Build and deploy an edge-based computer vision solution on smartphones for damage assessment.
- Review and relabel the data with finer-grained classes: no damage, affected, minor, major, destroyed
- Collect more, higher-quality earthquake data.