CrisisFlow — Google ADK pipeline (project story)
Inspiration
Every major disaster raises the same question: who needs what, and how fast can supplies reach them? Coordinators still stitch together feeds, rough estimates, and phone calls to hubs while the first 72 hours pass.
We focused on an ADK-driven workflow: for disasters already stored in a shared database, an agent graph pulls grounded weather, impact research, and logistics context (Sphere-style needs, UNHRD stock, OCHA funding), then writes structured enrichment back so operators see one coherent picture—not only model prose in a chat pane.
What it does
CrisisFlow’s Google ADK app (`crisisflow/agent.py`) is a `SequentialAgent` (`CrisisFlowDisasterAnalyst`) that runs on-demand enrichment (e.g. via `adk web`):
- **DisasterDataPrepAgent** — uses FastMCP tools to load events from Snowflake (`query_disaster_events`) and compound-risk context (`get_nearby_disasters`), then emits structured `event_data` JSON with exact event IDs so later `UPDATE`s hit real rows.
- **ParallelAgent** — three `LlmAgent` sub-agents in parallel:
  - **WeatherContextAgent** — Open-Meteo forecast + history via MCP → `weather_analysis`.
  - **ImpactAnalysisAgent** — Google Search for population, infrastructure, precedent, news → `impact_analysis`.
  - **AidContextAgent** — MCP: `get_depot_inventory`, `get_nearest_depots`, `calculate_sphere_needs`, `get_ocha_funding` → `aid_context`.
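To make the hand-off concrete, here is a hypothetical shape for the structured `event_data` payload the prep agent emits; the field names and values below are illustrative assumptions, not CrisisFlow's actual schema:

```python
import json

# Illustrative (assumed) shape of the structured event_data JSON.
# The key property is the exact Snowflake event id, so later UPDATEs
# target real rows rather than model-invented identifiers.
event_data = {
    "events": [
        {
            "event_id": "EQ-2024-000123",  # exact row id from query_disaster_events
            "event_type": "earthquake",
            "country": "Japan",
            "severity": "red",
            "latitude": 37.5,
            "longitude": 141.1,
            # compound-risk context from get_nearby_disasters
            "nearby_disasters": [
                {"event_id": "TC-2024-000098", "distance_km": 210.0}
            ],
        }
    ]
}

# Serialized once so every downstream agent parses the same payload.
payload = json.dumps(event_data)
```

Emitting one canonical JSON blob means the three parallel analysts and the persistence step all key off the same event IDs.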
- **LoopAgent** (risk assessment, max 3 iterations) — `RiskValidator` calls `validate_risk_assessment` (programmatic checks: impact present, plausible populations, severity alignment; weather optional when impact exists). `RiskCorrector` can refine failures with search, then `exit_loop`.
- **EnrichmentPersistAgent** — non-LLM step: runs `save_disaster_enrichment`, merging session analyses into Snowflake (`allocation`, `agent_reasoning`, `operational_summary`, `enrichment_status`, …) and, when configured, committing hub stock (optionally via Flask `POST /inventory/commit` so dashboard inventory matches the ADK process).
- **BriefingAgent** — produces a long CRISISFLOW DISASTER INTELLIGENCE BRIEFING, calls `persist_intelligence_briefing` (full text → `ENRICHMENT_SUMMARY` for the UI), then `send_a2a_briefing` for optional A2A handoff.
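A minimal sketch of the kind of programmatic checks the risk loop relies on, in the spirit of `validate_risk_assessment`; the field names, severity scale, and population bound here are assumptions for illustration, not CrisisFlow's actual thresholds:

```python
# Sketch only: assumed state keys and thresholds, not the real implementation.
def validate_risk_assessment(state: dict) -> dict:
    issues = []

    impact = (state.get("impact_analysis") or "").strip()
    if not impact:
        issues.append("impact_analysis is missing")

    # Plausibility check: affected population must be a sane non-negative number.
    population = state.get("affected_population")
    if population is not None and not (0 <= population <= 2_000_000_000):
        issues.append(f"implausible affected_population: {population}")

    # Severity must align with the scale used elsewhere in the pipeline.
    if state.get("severity") not in {"green", "orange", "red"}:
        issues.append(f"unknown severity: {state.get('severity')}")

    # Weather context is optional as long as impact analysis exists.
    if not state.get("weather_analysis") and not impact:
        issues.append("neither weather_analysis nor impact_analysis present")

    return {"valid": not issues, "issues": issues}
```

Because the checks are deterministic code rather than another LLM judgment, the loop has a stable exit condition: the corrector keeps refining until the dict comes back `valid` or the iteration cap is hit.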
The React + Vite dashboard and Flask API read the same Snowflake data (and live inventory when synced), so enrichment, aid breakdowns, and briefing summaries appear in the app—not only in ADK Web.
How we built it
| Area | Role |
|---|---|
| Python 3.13 + Google ADK + Uvicorn | `adk web`, runners, agent graph |
| `crisisflow/agent.py` | `SequentialAgent` → data prep → parallel analysis → risk loop → persist → briefing |
| `crisisflow/tools.py` | `validate_risk_assessment`, `save_disaster_enrichment`, `persist_intelligence_briefing`, `send_a2a_briefing` |
| `crisisflow_mcp/server.py` | Snowflake queries, compound risk, Open-Meteo, depot inventory/routing, Sphere needs, OCHA FTS, … |
| `gemini_adk_plugin.py` | Key rotation over `GEMINI_KEYS`, `asyncio.Lock` to serialize LLM calls under `ParallelAgent`, model fallback `gemini-2.5-flash` |
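The key-rotation idea in the plugin can be sketched as follows; this is a simplified illustration under assumptions (keys supplied as a list, one shared `asyncio.Lock` guarding the rotation cursor), not the plugin's actual code:

```python
import asyncio
import itertools

class KeyRotator:
    """Sketch: round-robin over API keys, serialized with an asyncio.Lock
    so concurrent sub-agents under a ParallelAgent never race on the cursor."""

    def __init__(self, keys):
        self._cycle = itertools.cycle(keys)
        self._lock = asyncio.Lock()

    async def next_key(self) -> str:
        # Only one caller advances the cycle at a time.
        async with self._lock:
            return next(self._cycle)

async def demo():
    # In the real plugin these would come from the GEMINI_KEYS env var.
    rotator = KeyRotator(["key-a", "key-b", "key-c"])
    # Three concurrent callers, as under ParallelAgent.
    return await asyncio.gather(*(rotator.next_key() for _ in range(3)))

picked = asyncio.run(demo())
```

Rotating keys per call spreads quota across the pool, and the lock keeps the parallel fan-out from handing two agents the same cursor position at once.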
Built With
- a2a-sdk
- agent-to-agent-protocol-(a2a)
- fastmcp
- flask
- flask-cors
- gdacs-api
- google-adk
- google-gemini
- google-search-api
- javascript
- model-context-protocol-(mcp)
- nasa-eonet-api
- noaa-national-weather-service-api
- ocha-financial-tracking-service-api
- open-meteo-api
- python
- react
- react-globe.gl
- snowflake
- snowflake-connector-python
- sphere-handbook-2018
- starlette
- three.js
- unhrd
- usgs-fdsnws-api
- uvicorn
- vite
