Inspiration
The 2025 monsoon floods in Pakistan affected nearly 7 million people, yet many fatalities occurred simply because warnings didn't reach the right people in a format they understood. We noticed a fragmentation gap: the data exists within agencies like the NDMA and PMD, but it's buried in multi-page PDFs and technical jargon. This year's monsoon season is expected to be significantly wetter. So, inspired by international alerting systems, we set out to build an intelligent system that makes it easier for locals to access and act on that information.
What it does
Realtime Emergency Alert Collection Hub is an automated "Collection → Processing → Distribution" pipeline.
- Continuous Scrape: Every 10 minutes, our system scans official government sources for new reports.
- AI Interpretation: A multimodal model reads everything in the collected documents, including text, maps, and charts, and transforms them in under a minute into a Common Alerting Protocol (CAP)-inspired structure that laypeople can easily interpret and that stays compatible with other alerting systems.
- Interactive Visualization: Alerts are plotted on a high-fidelity map dashboard where users can filter by disaster type, search for alerts near their location, and view distilled summaries instead of long technical documents.
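A minimal sketch of the CAP-inspired structure the pipeline targets. The field names follow CAP 1.2 conventions; the example values and the identifier are hypothetical, not taken from a real NDMA bulletin:

```python
# Sketch of a CAP-inspired alert record. Field names follow the
# Common Alerting Protocol; all values below are hypothetical.
alert = {
    "identifier": "ndma-2025-0713-001",   # hypothetical ID scheme
    "sent": "2025-07-13T09:30:00+05:00",
    "status": "Actual",
    "msgType": "Alert",
    "info": {
        "event": "Riverine Flood",
        "urgency": "Expected",
        "severity": "Severe",
        "certainty": "Likely",
        "headline": "High flood level expected in northern Sindh",
        "area": {
            "areaDesc": "Northern Sindh",
            # polygon resolved by the geocoder, WGS84 (lon, lat) pairs
            "polygon": [[68.8, 28.4], [69.7, 28.4], [69.7, 27.6], [68.8, 27.6]],
        },
    },
}

REQUIRED = {"identifier", "sent", "status", "msgType", "info"}

def is_valid(a: dict) -> bool:
    """Minimal structural check before an alert enters the database."""
    return REQUIRED <= a.keys() and "area" in a.get("info", {})
```

Keeping the shape close to CAP is what makes the output interoperable with other alerting systems rather than a one-off schema.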
How we built it
We built a robust, cloud-native stack designed for speed and scale:
- The Brain: We utilized Gemini 3 Flash in reasoning mode with few-shot prompting to handle the messy, inconsistent formats of official reports.
- The Pipeline: A BeautifulSoup scraper extracts the data, a custom PostGIS-based geocoder resolves place names to polygons, and the results feed into a PostgreSQL database for storage.
- The Interface: A beautiful frontend built with React.js and Mapbox.
- Infrastructure: Hosted on a modular architecture using Supabase, Render and Modal to ensure the scrapers and AI workers don't bottleneck each other.
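The scraping stage of the pipeline can be sketched as follows. The HTML snippet and the `.pdf` link convention are illustrative; the real NDMA and PMD listing pages each need their own selectors:

```python
from bs4 import BeautifulSoup

def extract_report_links(html: str, base_url: str) -> list[str]:
    """Pull PDF report links out of an agency listing page.

    The selector here is illustrative; real government pages
    require page-specific selectors.
    """
    soup = BeautifulSoup(html, "html.parser")
    links = []
    for a in soup.select("a[href$='.pdf']"):
        href = a["href"]
        if href.startswith("/"):  # make relative links absolute
            href = base_url.rstrip("/") + href
        links.append(href)
    return links

# Tiny illustrative page, not a real NDMA document listing.
sample = """
<ul>
  <li><a href="/reports/sitrep-142.pdf">Situation Report 142</a></li>
  <li><a href="/about.html">About</a></li>
</ul>
"""
print(extract_report_links(sample, "https://example.gov.pk"))
```

Each newly discovered link is handed off to the AI worker, so slow document processing never blocks the 10-minute scrape cycle.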
Challenges we ran into
JSON structure
An issue we faced was the model not producing the desired JSON even after extensive prompt engineering. We switched to a few-shot prompting approach, giving the model examples of documents perfectly transformed into JSON. This drastically improved results.
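The few-shot approach can be sketched as simple prompt assembly. The example pair and field names below are hypothetical stand-ins for our real curated examples:

```python
import json

# Hypothetical few-shot examples: (raw bulletin excerpt, target JSON).
EXAMPLES = [
    (
        "PMD advises heavy rainfall over upper catchments of River Kabul "
        "during the next 24 hours.",
        {"event": "Heavy Rainfall",
         "areaDesc": "Upper catchments of River Kabul",
         "validity_hours": 24},
    ),
]

def build_prompt(document_text: str) -> str:
    """Assemble a few-shot prompt that precedes the document to convert.

    Worked (bulletin -> JSON) pairs pinned the model to the exact
    schema far more reliably than schema descriptions alone.
    """
    parts = ["Convert the bulletin to JSON. Follow the examples exactly.\n"]
    for doc, target in EXAMPLES:
        parts.append(f"Bulletin:\n{doc}\nJSON:\n{json.dumps(target)}\n")
    parts.append(f"Bulletin:\n{document_text}\nJSON:\n")
    return "\n".join(parts)
```

Ending the prompt at `JSON:` nudges the model to complete with the object itself, mirroring the example pairs.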
Geocoding
We faced a lot of issues where the documents described regions compositionally. We solved this for now with a heuristics-based approach. For example, we asked the model to transform phrases like "areas downstream of Tarbela Dam" into the district containing the dam, leveraging the model's world knowledge. For directional descriptions like "Northern Sindh", we used bounding-box intersections to extract the matching districts from the province's area.
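The directional heuristic can be sketched in plain Python. The bounding boxes below are rough, illustrative coordinates, not survey data, and the two districts are just examples (Kashmore lies in northern Sindh, Karachi in the south):

```python
# Bounding boxes as (min_lon, min_lat, max_lon, max_lat), WGS84.
# Rough illustrative values only.
SINDH = (66.5, 23.5, 71.1, 28.5)
DISTRICTS = {
    "Kashmore": (68.8, 27.8, 70.0, 28.5),
    "Karachi":  (66.6, 24.7, 67.6, 25.6),
}

def overlaps(a, b):
    """True if two (min_lon, min_lat, max_lon, max_lat) boxes intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def northern_half(bbox):
    """Upper half of a bounding box, split at the latitude midpoint."""
    min_lon, min_lat, max_lon, max_lat = bbox
    mid = (min_lat + max_lat) / 2
    return (min_lon, mid, max_lon, max_lat)

def resolve_directional(province_bbox, districts):
    """'Northern <province>' -> districts intersecting the upper half."""
    target = northern_half(province_bbox)
    return [name for name, bb in districts.items() if overlaps(bb, target)]
```

In production the same intersection runs against real polygons inside PostGIS rather than rectangles, but the idea is identical.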
Performance
Storing the information in a normalized form led to many joins and repetitive processing. We fixed that by denormalizing and simplifying the data into an indexed read model for blazing-fast performance.
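The denormalization step can be illustrated with an in-memory SQLite toy. Table and column names here are hypothetical, and our production store is PostgreSQL/PostGIS, but the pattern is the same: flatten the joined data into one read-optimized table and index the columns the dashboard filters on:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Normalized write-side tables (hypothetical schema).
    CREATE TABLE alerts (id INTEGER PRIMARY KEY, event TEXT, severity TEXT);
    CREATE TABLE areas  (alert_id INTEGER, district TEXT);

    -- Denormalized read model: one row per (alert, district),
    -- so dashboard queries need no joins.
    CREATE TABLE alert_feed (
        alert_id INTEGER, event TEXT, severity TEXT, district TEXT
    );
    CREATE INDEX idx_feed_district ON alert_feed (district, severity);
""")
con.execute("INSERT INTO alerts VALUES (1, 'Flash Flood', 'Severe')")
con.execute("INSERT INTO areas VALUES (1, 'Swat')")

# Rebuild the read model whenever the normalized tables change.
con.execute("""
    INSERT INTO alert_feed
    SELECT a.id, a.event, a.severity, ar.district
    FROM alerts a JOIN areas ar ON ar.alert_id = a.id
""")

rows = con.execute(
    "SELECT event FROM alert_feed WHERE district = 'Swat'"
).fetchall()
print(rows)  # single indexed lookup, no joins at read time
```

Refreshing the read model on write trades a little extra storage and insert work for much cheaper reads, which is the right trade-off when the dashboard is queried far more often than new bulletins arrive.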
Accomplishments that we're proud of
We successfully reduced the "Information-to-Action" window. Seeing the system autonomously scrape a technical NDMA bulletin and, within minutes, render a clean alert icon over a specific tehsil on our map was a eureka moment. We've managed to create a system that is independent of telecom provider restrictions, making it accessible to anyone with a data connection.
What we learned
We learned that in disaster management, UX is a safety feature. A pretty map isn't just for show; it reduces the cognitive load on a panicked user. We also learned that multimodal AI (vision + text) is no longer optional for government tech; it is the only way to process the sheer volume of legacy document formats still in use.
What's next for REACH
This prototype is just the beginning. Our next steps include:
- Push notifications: Geo-fenced notifications for all our apps
- Deduplication: Logic to merge duplicate information appearing across agencies
- Mobile apps: Native apps for mobile platforms to reach (pun intended) the most people
- Localization: Translation into local languages so all our intended users are able to use the app
- QA agent: An agent capable of fetching data to answer natural language questions and produce visualizations if needed
Built With
- beautiful-soup
- fast-api
- gemini
- git
- google-ai-studio
- mapbox
- modal
- netlify
- postgis
- postgresql
- python
- react
- supabase