🌍 PulseAI — Real-Time AI-Driven Disaster Intelligence & Resource Orchestration
🧠 Inspiration
During disasters, information arrives too late, from too many places, and without coordination. Satellite images sit unused, tweets go unread, sensors raise alerts in isolation, and citizens file reports that take hours to validate.
We asked a simple question:
What if AI could continuously listen, reason, validate, and act — all in real time?
PulseAI was built to answer that question.
Our inspiration was to create a living, streaming intelligence system that:
- Understands disasters as they unfold
- Correlates citizen reports, satellite imagery, sensors, and social media
- Automatically decides when deeper analysis is required
- Triggers the right response workflows without human delay
💡 What PulseAI Does
PulseAI is an AI-native disaster response platform built on:
- Flink for real-time streaming intelligence
- ADK (Agent Development Kit) for multi-agent reasoning
- MCP (Model Context Protocol) for secure, tool-driven execution
- Google Cloud (BigQuery, GCS, Cloud Run) for scalable execution
At a high level, PulseAI performs four continuous intelligence loops:
- Citizen Case Analysis
- Satellite Damage Detection
- Sensor Anomaly Detection
- Tweet Classification & Verification
All of these flows ensure:
- Deterministic execution
- No duplicated actions
- Fully auditable AI decisions
🔁 System Architecture Overview

PulseAI is event-driven:
- Every incoming signal (tweet, sensor alert, image upload, citizen case) enters Kafka
- Flink Compute Pools process these streams
- AI Agents reason over the data
- MCP Tools execute real-world actions (queries, jobs, writes)
- Results are persisted back to Kafka and BigQuery
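To make this concrete, here is a minimal sketch of how one raw stream (tweets) could be declared as a Flink source table over Kafka. The topic name, schema, and broker settings are illustrative placeholders, not our exact Confluent configuration.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming Table API environment used by the pipeline sketches below
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Declare the raw tweet stream as a Kafka-backed table (names are placeholders)
t_env.execute_sql("""
    CREATE TABLE tweet_events (
        tweet_id STRING,
        text     STRING,
        event_ts TIMESTAMP(3),
        WATERMARK FOR event_ts AS event_ts - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'tweets.raw',
        'properties.bootstrap.servers' = 'broker:9092',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
""")
```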
🔁 Core Intelligence Flows
🧍 Citizen Case Analysis Flow

Flow Explanation
- Citizen submits a disaster case
- Case is summarized and pushed into a Kafka topic
- Flink AI Streaming Agent consumes the case
- The agent performs the following:
  - Queries tweet alerts via the BigQuery MCP tool
  - Queries sensor alerts via the BigQuery MCP tool
  - Forecasts required emergency resources
  - Fetches available nearby resources
  - Runs a custom resource allocation UDF (sketched below)
- The agent produces:
  - A complete disaster assessment
  - A resource allocation plan
- Results are:
  - Written to BigQuery
  - Published back to Kafka for dashboards and responders
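Keeping the allocation step in an ordinary Flink UDF keeps it deterministic and testable. Below is a minimal sketch; the column names, severity scale, and greedy rule are simplified assumptions, not the production logic.

```python
import json
from pyflink.table import DataTypes
from pyflink.table.udf import udf

@udf(result_type=DataTypes.STRING())
def allocate_resources(severity: int, available_json: str) -> str:
    """Greedy allocation: send more units for higher-severity cases."""
    available = json.loads(available_json)  # e.g. {"ambulance": 4, "fire_truck": 2}
    needed = max(1, severity)               # naive rule: one unit per severity point
    plan = {kind: min(count, needed) for kind, count in available.items()}
    return json.dumps(plan)

# Registered so the streaming agent / SQL pipeline can call it:
# t_env.create_temporary_function("allocate_resources", allocate_resources)
```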
🛰️ Damage Detection & Analysis Flow

Flow Explanation
- A satellite image is uploaded to Google Cloud Storage
- A file-creation event is published to Pub/Sub
- Kafka ingests the event into Flink
- The AI Streaming Agent:
  - Fetches pre-disaster and post-disaster image pairs
  - Queries contextual alerts (tweets + sensors)
- The agent evaluates whether damage analysis is required
- Only if it is required, the agent:
  - Calls a custom MCP server
  - Triggers the Satellite Damage Detection ADK agent
- The damage agent:
  - Submits a Cloud Run Job
  - Performs image analysis
  - Summarizes damage severity
- Results are persisted into BigQuery
⚠️ Important Design Choice
The MCP tool call is fire-and-forget, ensuring:
- No blocking
- No retries
- No duplicate job triggers
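For illustration, this is roughly what a fire-and-forget trigger looks like with the Cloud Run v2 client: the job is submitted and the tool returns without waiting on the long-running operation. The project, region, job name, and argument passing are placeholders, not our exact setup.

```python
from google.cloud import run_v2

def trigger_damage_detection_job(pre_image_uri: str, post_image_uri: str) -> str:
    """Submit the damage-detection Cloud Run Job and return immediately."""
    client = run_v2.JobsClient()
    name = "projects/my-project/locations/us-central1/jobs/damage-detection"
    overrides = run_v2.RunJobRequest.Overrides(
        container_overrides=[
            run_v2.RunJobRequest.Overrides.ContainerOverride(
                args=[pre_image_uri, post_image_uri]
            )
        ]
    )
    # run_job returns a long-running operation; we deliberately do NOT wait on
    # it, so the MCP tool never blocks, retries, or re-triggers the job.
    client.run_job(request=run_v2.RunJobRequest(name=name, overrides=overrides))
    return "submitted"
```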
📡 Sensor Anomaly Detection Flow

Flow Explanation
- Live sensor data enters Kafka
- Flink runs ML_DETECT_ANOMALIES on the stream
- If an anomaly is detected, an emergency alert is generated
- If not, the event is ignored
This ensures early warning detection without overwhelming responders.
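The branch after scoring is a plain filter. The exact ML_DETECT_ANOMALIES invocation is Confluent-specific, so the sketch below assumes the scored stream already exposes a boolean is_anomaly column (reusing the t_env from the earlier sketch); table and column names are placeholders.

```python
# Only anomalous readings become alerts; everything else is never selected,
# i.e. ignored by design.
t_env.execute_sql("""
    INSERT INTO emergency_alerts
    SELECT sensor_id, reading, event_ts, 'SENSOR_ANOMALY' AS alert_type
    FROM scored_sensor_readings
    WHERE is_anomaly
""")
```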
🐦 Tweet Classification Flow

Flow Explanation
- Tweets stream into Kafka
- The Flink AI Agent classifies tweet criticality
- If critical:
  - The tweet is summarized
  - Added to the emergency alerts stream
- Otherwise, the tweet is ignored
This prevents misinformation from polluting response decisions.
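Conceptually, the per-tweet logic is a classify-then-summarize branch. In the sketch below, classify_criticality, summarize, and emit_alert stand in for the agent's model calls and sinks; they are assumptions, not the actual prompts or bindings.

```python
from dataclasses import dataclass

@dataclass
class Tweet:
    tweet_id: str
    text: str

def handle_tweet(tweet: Tweet, classify_criticality, summarize, emit_alert) -> None:
    label = classify_criticality(tweet.text)   # e.g. "critical" / "not_critical"
    if label != "critical":
        return                                 # non-critical tweets are ignored
    emit_alert({
        "source": "tweet",
        "tweet_id": tweet.tweet_id,
        "summary": summarize(tweet.text),      # condensed for responders
    })
```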
🤖 ADK Agent Architecture

🔹 pulseai_manager_agent (Supervisor)
Role: Central coordinator
- Routes requests across all domain agents
- Prevents duplicate actions
- Maintains session-level reasoning
🔹 satellite_damage_detection_agent (Sequential)
Role: Deep image analysis
- Receives image URLs
- Triggers Cloud Run job
- Performs damage detection
- Produces summarized results
🔹 damage_summarizer_agent
Role: Converts raw inference into insights
- Fetches inference data
- Produces human-readable summaries
- Stores results in BigQuery
🔹 info_collector_agent
Role: Data ingestion & enrichment
- Collects case metadata
- Uploads assets to GCS
- Prepares inputs for downstream agents
🔹 case_summarizer_agent
Role: Citizen case normalization
- Converts raw citizen input into structured intelligence
🔹 rescue_live_detection_agent
Role: Human presence detection
- Runs detection models for trapped survivors
- Feeds results into rescue prioritization
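As an illustration of how these agents fit together, here is a minimal ADK wiring of the supervisor and the sequential damage-detection agent. Model names, instructions, and the sub-agent list are placeholders rather than our full configuration.

```python
from google.adk.agents import LlmAgent, SequentialAgent

# Sequential pipeline: trigger the analysis job, then summarize its output
satellite_damage_detection_agent = SequentialAgent(
    name="satellite_damage_detection_agent",
    sub_agents=[
        LlmAgent(
            name="damage_job_runner",
            model="gemini-2.0-flash",
            instruction="Trigger the Cloud Run damage-detection job for the given image pair.",
        ),
        LlmAgent(
            name="damage_summarizer_agent",
            model="gemini-2.0-flash",
            instruction="Summarize raw inference output and store the result in BigQuery.",
        ),
    ],
)

# Supervisor routes each request to exactly one domain agent
pulseai_manager_agent = LlmAgent(
    name="pulseai_manager_agent",
    model="gemini-2.0-flash",
    instruction="Route each request to exactly one domain agent; never repeat an action.",
    sub_agents=[satellite_damage_detection_agent],
)
```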
🛠️ How We Built It
Core Technologies
| Layer | Technology |
|---|---|
| Streaming | Flink |
| Messaging | Confluent Kafka |
| AI Agents | ADK |
| Tooling | MCP |
| Storage | BigQuery, GCS |
| Compute | Cloud Run Jobs |
| ML | BigQuery ML, Flink ML |
| Language | Python |
🧩 Challenges We Solved
- Preventing duplicate AI tool execution
- Making LLMs deterministic inside streams
- Orchestrating ML + UDF + Agents together
- Designing fire-and-forget MCP tools
- Ensuring low-latency disaster decisions
🏆 What We’re Proud Of
- A true streaming AI system, not batch AI
- Deterministic AI reasoning in Flink
- MCP-based real-world action execution
- End-to-end disaster intelligence in seconds
- A design that can scale nationally
📚 What We Learned
- Streaming + Agents is the future of AI systems
- AI must be controlled, not reactive
- Deterministic orchestration beats raw LLM power
- Real-world AI needs strong guardrails
🚀 What’s Next
- Drone & live video feeds
- Multilingual citizen reporting
- Government-grade dashboards
- Open APIs for NGOs
- Predictive disaster modeling
🌟 PulseAI doesn’t just analyze disasters — it acts on them. 🌟
Built With
- adk
- cloudrun
- confluent
- django
- flink
- gcp
- gcs
- gemini
- google-bigquery
- kafka
- mcp
- python
- vertex


