Inspiration
I was inspired by the increasing frequency and severity of natural disasters and the need for more effective predictive tools to save lives and reduce economic losses.
What it does
The project uses AI and machine learning to analyze historical data and predict fatalities and economic losses from natural disasters. It integrates data from Our World in Data and generates actionable insights for disaster preparedness and response.
How I built it
I utilized Python for data processing and model development, leveraging libraries such as Pandas, Scikit-learn, and TensorFlow. The dataset was sourced from public global natural disaster records, and I used cloud computing for scalable data processing and model training. Additionally, I deployed the application on Streamlit, which provides an engaging, interactive user experience.
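A minimal sketch of that modeling pipeline: pandas for preprocessing and scikit-learn for a regressor predicting disaster deaths. The column names and the small synthetic dataset below are illustrative stand-ins for the actual Our World in Data records, and a random forest is just one reasonable model choice.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Stand-in for the historical disaster dataset (hypothetical columns)
df = pd.DataFrame({
    "year": [1980, 1990, 2000, 2010, 2020] * 20,
    "disaster_type": ["Flood", "Earthquake", "Storm", "Drought", "Wildfire"] * 20,
    "magnitude": [3.1, 7.2, 4.5, 2.0, 5.5] * 20,
    "deaths": [120, 5400, 300, 80, 40] * 20,
})

# One-hot encode the categorical disaster type, keep numeric features as-is
X = pd.get_dummies(df[["year", "disaster_type", "magnitude"]])
y = df["deaths"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
preds = model.predict(X_test)
```

The same shape of pipeline works for economic losses: swap the target column and retrain.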
Challenges I ran into
One of the main challenges was handling the large dataset. I overcame this by implementing efficient data preprocessing techniques and using cloud-based resources for scalable computation.
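One common way to keep memory in check with a large CSV, as described above, is to stream it in chunks and downcast numeric dtypes as each chunk arrives. This is a sketch under assumed column names (`year`, `deaths`); the in-memory CSV stands in for the real file.

```python
import io
import pandas as pd

# Stand-in for a large on-disk CSV of disaster records
csv_data = "year,deaths\n" + "\n".join(
    f"{1900 + i % 120},{i * 3}" for i in range(1000)
)

chunks = []
# chunksize makes read_csv yield DataFrames of 250 rows at a time
for chunk in pd.read_csv(io.StringIO(csv_data), chunksize=250):
    # Downcast 64-bit columns to the smallest integer type that fits
    chunk["year"] = pd.to_numeric(chunk["year"], downcast="integer")
    chunk["deaths"] = pd.to_numeric(chunk["deaths"], downcast="integer")
    chunks.append(chunk)

df = pd.concat(chunks, ignore_index=True)
```

With a real file you would pass its path instead of `io.StringIO`, and the peak memory footprint stays bounded by the chunk size rather than the file size.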
Accomplishments that I am proud of
The application visualizes how natural disasters have affected different countries over the years, giving users a way to explore the data and raise awareness of natural disaster risk.
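The per-country, per-year view behind such a chart can be built with a pandas pivot table. The country names and figures below are illustrative; in the Streamlit app the pivoted frame would be passed to a charting call such as `st.line_chart`.

```python
import pandas as pd

# Hypothetical slice of the disaster records
records = pd.DataFrame({
    "country": ["Japan", "Japan", "Chile", "Chile"],
    "year": [2010, 2011, 2010, 2011],
    "deaths": [100, 19000, 520, 30],
})

# Total deaths per year, one column per country
pivot = records.pivot_table(
    index="year", columns="country", values="deaths", aggfunc="sum"
)
# In the Streamlit app: st.line_chart(pivot)
```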
What I learned
I learned the importance of integrating data sources and the power of AI and ML in transforming raw data into actionable insights. This project also reinforced the value of interdisciplinary collaboration across AI, machine learning, and data science.
What's next for Natural Disaster Risk Management with AI/ML
I plan to enhance the models by incorporating more real-time data sources and improving the algorithms for greater accuracy. Additionally, I aim to collaborate with disaster management agencies to implement the solution in real-world scenarios.
Built With
- colab
- pandas
- python
- scikit-learn
- streamlit