Inspiration

I’ve always been fascinated by how history repeats itself, especially when it comes to climate and natural disasters. During this hackathon’s “Decode the Past” challenge, I wanted to explore how historical data can teach us resilience. The idea behind ReMemory was simple but powerful: “What if students and communities could see the stories behind past disasters, and use that knowledge to prepare for the future?” This project reimagines how we interact with historical disaster datasets, not just as statistics, but as memory-based lessons for future safety.

What it does

ReMemory lets users:

Select a disaster type (flood, drought, hurricane, wildfire, earthquake).

Choose any country from EM-DAT’s global disaster records (2000–2025).

View visualized impact trends, showing how many people were affected each year.

Read AI-generated Preparedness & Recovery Briefs tailored to that country.

Access universal response contacts and relief programs, like WHO and IFRC, all in one dashboard.

ReMemory bridges historical data and AI education, helping users learn how to prepare, who to contact, and why data matters.

How we built it

The core app is built with Streamlit, using Python, Pandas, Plotly, and the OpenAI API. Here’s how it all connects:

🧮 Data Pipeline: I cleaned and normalized disaster CSV files from EM-DAT (2000–2025).

📊 Visualization: Plotly Express powers dynamic line charts showing impact trends per year.

🤖 AI Component: Using GPT-4o-mini, ReMemory generates contextual disaster preparedness plans, from risk assessment to home safety.

🌐 Interface: Streamlit provides a clean sidebar-based UI with dropdowns for disaster type, dataset, and location.

💾 Fallback Mode: If AI isn’t available, ReMemory automatically uses a built-in structured guide for disaster education.
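The fallback mode described above can be sketched in a few lines: try the AI generator, and if it fails (no API key, network error), return a built-in structured guide instead. This is a minimal illustration, not the exact app code; `ai_generate` and the guide contents are hypothetical stand-ins.

```python
# Hypothetical built-in guide used when the AI call isn't available.
FALLBACK_GUIDE = {
    "before": "Know your local risks, build an emergency kit, plan evacuation routes.",
    "during": "Follow official alerts; move to safe ground or shelter.",
    "after": "Check for injuries, document damage, contact relief programs.",
}

def generate_brief(disaster, country, ai_generate=None):
    """Return an AI-generated brief, or the static guide if AI is unavailable."""
    if ai_generate is not None:
        try:
            return ai_generate(disaster, country)
        except Exception:
            pass  # AI failed; fall through to the built-in guide
    return FALLBACK_GUIDE
```

Keeping the fallback as plain data means the dashboard still teaches something useful even with no API key configured.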

Challenges & What I Learned

Challenge: Cleaning inconsistent data across 5 disaster types with different schemas. Lesson: I learned how to build a flexible loader that dynamically normalizes columns to “country,” “year,” and “impact,” regardless of file naming.
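A flexible loader like the one described can be sketched with a small alias map: whichever variant of a column name appears in a given CSV gets renamed to the canonical one. The alias lists below are illustrative; real EM-DAT exports may use other spellings.

```python
import pandas as pd

# Assumed alias map: each disaster-type CSV may label these columns differently.
COLUMN_ALIASES = {
    "country": ["country", "Country", "Country Name", "ISO"],
    "year": ["year", "Year", "Start Year"],
    "impact": ["impact", "Total Affected", "No Affected"],
}

def normalize_columns(df: pd.DataFrame) -> pd.DataFrame:
    """Rename the first matching alias in each group to its canonical name."""
    renames = {}
    for canonical, aliases in COLUMN_ALIASES.items():
        for alias in aliases:
            if alias in df.columns:
                renames[alias] = canonical
                break  # use the first alias found for this canonical column
    return df.rename(columns=renames)
```

With this in place, the rest of the app can assume `country`, `year`, and `impact` exist no matter which file was loaded.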

Challenge: Streamlit deployment initially failed due to secret-key detection. Lesson: I now understand secure environment handling with .env files and Streamlit Secrets (TOML formatting!).
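The secure key handling mentioned here can be sketched as a small resolver: check Streamlit Secrets first (a `.streamlit/secrets.toml` entry like `OPENAI_API_KEY = "..."` in TOML format), then fall back to an environment variable (which a local `.env` file can populate). This is a minimal sketch of the pattern, not the app's exact code.

```python
import os

def get_api_key():
    """Resolve the OpenAI key from Streamlit Secrets, then the environment."""
    try:
        # streamlit is only importable (and secrets only populated) when
        # running inside a Streamlit app with a secrets.toml configured.
        import streamlit as st
        if "OPENAI_API_KEY" in st.secrets:
            return st.secrets["OPENAI_API_KEY"]
    except Exception:
        pass  # no Streamlit, or no secrets file: fall back to the environment
    return os.environ.get("OPENAI_API_KEY")
```

Keeping the key out of the repository entirely is what avoids the secret-key detection failure during deployment.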

Challenge: Generating long AI responses without truncation. Lesson: Adjusting token limits and structured prompting made the preparedness output clear and complete.

Overall, I learned how to merge data visualization, AI, and civic education into one cohesive app, a skill I’ll carry into every future project.

Impact

ReMemory reframes “history” not as something we read, but as something we prepare from. By using AI to decode past disasters, the app helps users understand what went wrong, what improved, and how preparedness saves lives.

It’s a small but scalable step toward AI-assisted environmental education and disaster resilience.

Built With

  • EM-DAT dataset
  • OpenAI API
  • pandas
  • Plotly Express
  • Python 3.11
  • Streamlit