CrisisFlow — Google ADK pipeline (project story)

Inspiration

Every major disaster raises the same question: who needs what, and how fast can supplies reach them? Coordinators still stitch together feeds, rough estimates, and phone calls to hubs while the first 72 hours pass.

We focused on an ADK-driven workflow: for disasters already stored in a shared database, an agent graph pulls grounded weather, impact research, and logistics context (Sphere-style needs, UNHRD stock, OCHA funding), then writes structured enrichment back so operators see one coherent picture—not only model prose in a chat pane.


What it does

CrisisFlow’s Google ADK app (crisisflow/agent.py) is a SequentialAgent (CrisisFlowDisasterAnalyst) that runs on-demand enrichment (e.g. via adk web):

  1. DisasterDataPrepAgent — Uses FastMCP tools to load events from Snowflake (query_disaster_events) and compound-risk context (get_nearby_disasters), then emits structured event_data JSON with exact event ids so later UPDATEs hit real rows.

  2. ParallelAgent — Three LlmAgent sub-agents in parallel:

    • WeatherContextAgent — Open-Meteo forecast + history via MCP → weather_analysis.
    • ImpactAnalysisAgent — Google Search for population, infrastructure, precedent, news → impact_analysis.
    • AidContextAgent — MCP: get_depot_inventory, get_nearest_depots, calculate_sphere_needs, get_ocha_funding → aid_context.

  3. LoopAgent (risk assessment, max 3 iterations) — RiskValidator calls validate_risk_assessment (programmatic checks: impact present, plausible populations, severity alignment; weather optional when impact exists). RiskCorrector can refine failures with search, then exit_loop.

  4. EnrichmentPersistAgent — Non-LLM step: runs save_disaster_enrichment, merging session analyses into Snowflake (allocation, agent_reasoning, operational_summary, enrichment_status, …) and, when configured, committing hub stock (optionally via Flask POST /inventory/commit so dashboard inventory matches the ADK process).

  5. BriefingAgent — Produces a long CRISISFLOW DISASTER INTELLIGENCE BRIEFING, calls persist_intelligence_briefing (full text → ENRICHMENT_SUMMARY for the UI), then send_a2a_briefing for optional A2A handoff.
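The programmatic checks in the risk loop (step 3) can be sketched in pure Python. This is an illustrative approximation only — the field names and thresholds below are assumptions, not the actual validate_risk_assessment signature from crisisflow/tools.py:

```python
def validate_risk_assessment(event: dict) -> list[str]:
    """Return a list of validation failures (empty list = pass).

    Sketch of step 3's programmatic checks; the real tool may use
    different field names, thresholds, and severity scales.
    """
    failures = []
    impact = event.get("impact_analysis")
    if not impact:
        failures.append("impact_analysis missing")
    population = event.get("affected_population")
    if population is None or not (0 < population < 2_000_000_000):
        failures.append("affected population implausible")
    severity = event.get("severity")  # assumed scale: 1 (minor) .. 5 (catastrophic)
    if severity is not None and population is not None:
        # Severity alignment: a near-catastrophic rating should not
        # pair with a tiny affected population.
        if severity >= 4 and population < 1_000:
            failures.append("severity/population mismatch")
    # Weather context is optional when an impact analysis exists.
    if not event.get("weather_analysis") and not impact:
        failures.append("no weather and no impact context")
    return failures
```

A passing event returns `[]`, which is what lets RiskValidator call exit_loop instead of handing off to RiskCorrector.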

The React + Vite dashboard and Flask API read the same Snowflake data (and live inventory when synced), so enrichment, aid breakdowns, and briefing summaries appear in the app—not only in ADK Web.


How we built it

| Area | Role |
| --- | --- |
| Python 3.13 + Google ADK + Uvicorn | adk web, runners, agent graph |
| crisisflow/agent.py | SequentialAgent → data prep → parallel analysis → risk loop → persist → briefing |
| crisisflow/tools.py | validate_risk_assessment, save_disaster_enrichment, persist_intelligence_briefing, send_a2a_briefing |
| crisisflow_mcp/server.py | Snowflake queries, compound risk, Open-Meteo, depot inventory/routing, Sphere needs, OCHA FTS, … |
| gemini_adk_plugin.py | Key rotation over GEMINI_KEYS, asyncio.Lock to serialize LLM calls under ParallelAgent, model fallback gemini-2.5-flash |
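The key-rotation idea in gemini_adk_plugin.py can be approximated with stdlib pieces alone. A hedged sketch — class and method names here are hypothetical, and the real plugin hooks into ADK's plugin callbacks rather than wrapping calls directly:

```python
import asyncio
import itertools


class KeyRotator:
    """Round-robin over a list of API keys (e.g. split from GEMINI_KEYS),
    serializing calls with an asyncio.Lock so sub-agents running under
    ParallelAgent never race on the same key. Hypothetical sketch only.
    """

    def __init__(self, keys: list[str]):
        if not keys:
            raise ValueError("no API keys configured")
        self._cycle = itertools.cycle(keys)
        self._lock = asyncio.Lock()

    async def call(self, llm_fn, *args, **kwargs):
        # One LLM call at a time; each call gets the next key in the cycle.
        async with self._lock:
            key = next(self._cycle)
            return await llm_fn(key, *args, **kwargs)


async def demo():
    rotator = KeyRotator(["k1", "k2"])

    async def fake_llm(key, prompt):
        # Stand-in for a Gemini call; just echoes which key served it.
        return f"{key}:{prompt}"

    return await asyncio.gather(*(rotator.call(fake_llm, p) for p in ("a", "b", "c")))
```

Even when three parallel agents fire at once, the lock forces the calls through one at a time and the keys alternate, which is what keeps per-key rate limits from tripping under ParallelAgent fan-out.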

Built With

  • a2a-sdk
  • agent-to-agent-protocol-(a2a)
  • fastmcp
  • flask
  • flask-cors
  • gdacs-api
  • google-adk
  • google-gemini
  • google-search-api
  • javascript
  • model-context-protocol-(mcp)
  • nasa-eonet-api
  • noaa-national-weather-service-api
  • ocha-financial-tracking-service-api
  • open-meteo-api
  • python
  • react
  • react-globe.gl
  • snowflake
  • snowflake-connector-python
  • sphere-handbook-2018
  • starlette
  • three.js
  • unhrd
  • usgs-fdsnws-api
  • uvicorn
  • vite