Inspiration

The inspiration for DisasterIQ originated from the limitations of traditional predictive models in the insurance industry, which rely heavily on historical, 2D data. These models often struggle to adapt to the volatility caused by real-time disasters, macroeconomic shifts, and market dynamics. As a result, insurers face inefficiencies in risk management, pricing, and forecasting. Our goal was to harness the power of Multi-Agent AI to overcome these challenges by integrating real-time data and market insights, offering insurers a way to optimize risk management dynamically and make data-driven decisions faster and more accurately.

What it does

DisasterIQ is a real-time, Multi-Agent AI system that enhances disaster response, risk optimization, and pricing strategies for the insurance industry. Our system processes real-time disaster data, economic trends, and market signals to predict spikes in demand, detect anomalies, and optimize pricing. It uses ML models and mathematical reasoning to analyze patterns, forecast risks, and generate actionable insights. The AI also includes an explanatory layer that provides human-readable reports on risks, anomalies, and strategic recommendations, making it easier for decision-makers to take immediate action.

How we built it

DisasterIQ was developed using a modular, four-agent architecture to address the complexity of real-time data and market analysis:

  • Risk & Anomaly Detection Agent: This agent uses time-series algorithms (we are considering LSTM and ARIMA) to detect anomalies in claims data, correlating them with disaster events and market changes.
  • Market Insights AI: By pulling real-time disaster data from sources like the tomorrow.io API, OpenFEMA, and financial trend feeds, this agent helps the system understand market dynamics and policy shifts. Sentiment analysis of social media and news outlets further refines the predictions.
  • Demand Forecasting & Risk Optimization Agent: Using models like XGBoost, Prophet, and Monte Carlo Simulation, this agent predicts future claim demand and suggests pricing adjustments based on anticipated changes in risk.
  • AI Explanatory Agent: This agent converts complex model outputs into easily understandable insights using models like GPT-4. It generates reports on how different factors, such as a hurricane or an economic downturn, affect insurance claims.
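To make the Risk & Anomaly Detection Agent concrete, here is a minimal sketch of its detection logic using a rolling z-score baseline. This is a simple stand-in for the LSTM/ARIMA models we are considering; the function name, window size, and threshold are illustrative assumptions, not our final design.

```python
from statistics import mean, stdev

def detect_anomalies(claims, window=6, threshold=3.0):
    """Flag claim values that deviate sharply from the recent rolling baseline.

    `claims` is a list of (period, value) pairs in chronological order.
    `window` and `threshold` are illustrative parameters, not tuned values.
    """
    anomalies = []
    for i in range(window, len(claims)):
        history = [v for _, v in claims[i - window:i]]
        mu, sigma = mean(history), stdev(history)
        period, value = claims[i]
        # A z-score above the threshold suggests a disaster-driven spike.
        if sigma > 0 and abs(value - mu) / sigma > threshold:
            anomalies.append((period, value))
    return anomalies

claims = [("2024-%02d" % m, 100 + m) for m in range(1, 9)]
claims.append(("2024-09", 500))  # spike after a simulated disaster event
print(detect_anomalies(claims))  # only the September spike is flagged
```

Flagged periods would then be cross-referenced with the disaster events pulled in by the Market Insights AI to decide whether a spike is disaster-driven.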

The dashboard frontend was built using Streamlit, with data visualizations powered by Plotly. The backend communicates with a FastAPI microservice to fetch data dynamically, so the system always stays up to date with the latest market and disaster information.

Our approach starts with the most simplified version of the model and gradually builds upon it, refining insights and making them more actionable for insurers. We begin by analyzing inflation over the last six months: if it has increased over this period, that serves as a basic indicator that insurance prices may rise. This is the simplest possible insight we can extract, directly linking macroeconomic trends to insurance pricing. From there, we introduce weather indicators, such as wind speeds or precipitation, which could signal an approaching storm. If a storm is coming, insurers may anticipate higher claim payouts, leading to premium adjustments. At this stage, we are still keeping things simple, looking at economic and weather indicators separately.
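The first-layer inflation check can be sketched in a few lines. The function name, message wording, and input shape below are our assumptions; the logic is just the net six-month change described above.

```python
def inflation_signal(cpi, months=6):
    """Return a basic pricing signal from the net CPI change over `months`.

    `cpi` is a chronological list of monthly index values (oldest first).
    The message strings are illustrative, not calibrated recommendations.
    """
    if len(cpi) < months + 1:
        raise ValueError("need at least %d months of CPI data" % (months + 1))
    net_change = cpi[-1] - cpi[-1 - months]
    if net_change > 0:
        return "inflation up %.1f points: insurance prices may rise" % net_change
    return "inflation flat or falling: no upward pricing pressure"

# Illustrative CPI series rising ~0.4 index points per month.
cpi = [300.0 + 0.4 * m for m in range(12)]
print(inflation_signal(cpi))
```

Later layers replace this single scalar check with the richer indicator sets described below, but the input/output contract (data in, human-readable signal out) stays the same.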

Next, we expand beyond a single metric in both areas. Instead of just inflation, we also look at GDP, CPI, and interest rates to get a more complete picture of economic risk. On the weather side, we go beyond storms to consider hurricanes, wildfires, and other extreme events that could drive up claims. Once we have this broader dataset, we stop looking at indicators in isolation and start analyzing relationships between them. For example, if a storm is coming while inflation is rising, home insurance prices may increase in high-risk areas. If wildfire risks are high at the same time an economic downturn is unfolding, insurers may need to prepare for increased claims and higher fraud risk. This is where we shift from looking at individual signals to generating multi-factor insights that help insurers make better risk assessments.
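The shift from isolated signals to multi-factor insights can be expressed as simple combination rules. The rules below mirror the two examples in the text; the boolean inputs and rule set are illustrative assumptions, not the full system.

```python
def multi_factor_insights(inflation_rising, storm_incoming,
                          wildfire_risk_high, economic_downturn):
    """Combine single-indicator signals into multi-factor insights.

    Inputs are booleans produced by the individual indicator checks; the
    rules are illustrative, mirroring the worked examples in the text.
    """
    insights = []
    if storm_incoming and inflation_rising:
        insights.append("home insurance prices may increase in high-risk areas")
    if wildfire_risk_high and economic_downturn:
        insights.append("prepare for increased claims and higher fraud risk")
    return insights

print(multi_factor_insights(inflation_rising=True, storm_incoming=True,
                            wildfire_risk_high=False, economic_downturn=False))
```

In practice these hand-written rules are the stepping stone toward the learned models described next, which discover such relationships from the encoded data rather than from hard-coded conditions.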

With these structured insights, we then transition into predictive modeling by encoding risk indicators into numerical values. This allows us to build AI-driven models that detect anomalies, forecast future risks, and classify areas as low, medium, or high risk. The ultimate goal is to develop a risk score that combines all these insights into a single, easy-to-interpret metric. This score would help insurers adjust pricing dynamically, identify high-risk regions in advance, and improve underwriting decisions. By following this structured, step-by-step approach, we evolve from basic trend analysis to a powerful predictive model that has real-world applications for insurance pricing, claims assessment, and risk management.
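A minimal sketch of the combined risk score might look like the following. The equal default weights and the low/medium/high cut-offs are our assumptions for illustration; a real version would learn or calibrate them.

```python
def risk_score(indicators, weights=None):
    """Combine encoded risk indicators (each in [0, 1]) into one score.

    `indicators` maps names like "inflation" or "storm" to encoded values.
    Equal default weights and the band cut-offs are illustrative choices.
    """
    if weights is None:
        weights = {name: 1.0 for name in indicators}
    total_weight = sum(weights[name] for name in indicators)
    score = sum(indicators[name] * weights[name]
                for name in indicators) / total_weight
    if score < 0.33:
        band = "low"
    elif score < 0.66:
        band = "medium"
    else:
        band = "high"
    return score, band

score, band = risk_score({"inflation": 0.8, "storm": 0.9, "gdp": 0.7})
print(round(score, 2), band)  # 0.8 high
```

Collapsing everything into one score and band keeps the output easy to interpret for pricing and underwriting decisions, while the per-indicator encodings remain available for the explanatory layer.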

Challenges we ran into

One of the major challenges we faced was making our diverse datasets compatible. The datasets we sourced, ranging from insurance claims data (monthly values) to disaster events (daily records), covered different timeframes. Aligning them on a common time axis was essential for the system to generate accurate predictions.
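The alignment step itself is straightforward once stated: aggregate the daily disaster records up to the monthly granularity of the claims data. A minimal stdlib sketch, where counting events per month is one simple aggregation choice among several (sum of losses, max severity, ...):

```python
from collections import defaultdict
from datetime import date

def daily_to_monthly(event_dates):
    """Aggregate daily disaster events into monthly counts so they line up
    with monthly claims data. `event_dates` is a list of `date` objects."""
    monthly = defaultdict(int)
    for day in event_dates:
        monthly[(day.year, day.month)] += 1
    return dict(monthly)

events = [date(2024, 9, 3), date(2024, 9, 17), date(2024, 10, 2)]
print(daily_to_monthly(events))  # {(2024, 9): 2, (2024, 10): 1}
```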

Another challenge was implementing all of our proposed ideas within the limited time span of the hackathon. Our brainstorming sessions took a long time as we refined the project's objectives and milestones.

Accomplishments that we're proud of

Our dashboard, developed using Streamlit and Plotly, allows users to interact with the data and visualizations in an intuitive way, providing them with up-to-the-minute insights into risks, anomalies, and inventory statuses.

One of the key accomplishments is the integration of real-time macroeconomic and weather data through dedicated API modules. The macro_api.py module connects to the FRED API to fetch real-time inflation data (CPI), analyzing trends over the past 36 months and computing net inflation changes over the last six months. This allows for forecasting potential impacts on insurance pricing, complete with dynamic visualizations that enhance risk and pricing assessments. Simultaneously, the weather_api.py module retrieves historical weather data, including temperature, precipitation, and windspeed, from the Open-Meteo API. This data is processed and structured for disaster-related risk analysis, ensuring that users gain comprehensive insights into environmental risk factors.
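To illustrate how the processed weather records feed risk analysis, here is a small sketch that flags potential storm days from windspeed and precipitation. The record structure, thresholds, and function name are our assumptions for illustration, not the exact weather_api.py implementation.

```python
def storm_risk_days(daily_records, wind_kmh=60.0, precip_mm=25.0):
    """Flag days whose windspeed or precipitation exceeds rough thresholds.

    `daily_records` mimics processed Open-Meteo output: a list of dicts
    with "date", "windspeed" (km/h), and "precipitation" (mm). The two
    thresholds are illustrative assumptions, not calibrated values.
    """
    return [
        rec["date"]
        for rec in daily_records
        if rec["windspeed"] >= wind_kmh or rec["precipitation"] >= precip_mm
    ]

records = [
    {"date": "2024-09-01", "windspeed": 20.0, "precipitation": 2.0},
    {"date": "2024-09-02", "windspeed": 85.0, "precipitation": 40.0},
]
print(storm_risk_days(records))  # ['2024-09-02']
```

Flags like these become the boolean weather inputs to the multi-factor insight rules described in "How we built it".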

Additionally, we established a single-agent AI risk score methodology as a foundational step toward our ultimate goal of a Multi-Agent AI system. Each AI agent is tested independently to ensure precise risk scoring based on macroeconomic indicators and weather patterns. This modular approach allows for accurate evaluation of individual risk factors and ensures that future integration of multiple agents will be seamless and scalable.

What we learned

We think our biggest learning was to break an idea or problem down into its most basic atomic units and build it back up layer by layer. While this may or may not be the fastest path in a hackathon setting, we believe this takeaway is one of the most important ones we take away, and it could shape our approach to both technical and non-technical problems in the future. Having been blocked by the complexity of an idea, we learned that breaking that complexity into smaller units and building back up really helps make tangible progress.

What's next for DisasterIQ

Our struggle with implementation really pushed us to think in the way described above. So, while we could not finish implementing the idea we set out to tackle, we charted a clear approach and methodology for building this project in the layer-over-layer format described above. That approach is laid out in the "How we built it" section, and we plan to implement it together after this hackathon, because we think this is a genuine business pain point worth tackling.

Built With

Streamlit, Plotly, FastAPI, Python
