Inspiration
The inspiration for Pulse came from a critical question: "How do I get life-saving information to people when the systems I rely on most—internet and power—suddenly fail?" I was driven to create a platform that wasn't just intelligent, but fundamentally resilient. The goal was to build a tool that could keep a community connected and informed during its most vulnerable moments, ensuring that help and information are always within reach, even when offline.
What it does
Pulse is an AI-powered emergency intelligence platform, delivered through a mobile-first Android app and a companion web dashboard.
At its core, Pulse provides real-time incident reporting and visualization. Users can ask the AI assistant natural language questions like, "What fires were reported near me yesterday?" and get immediate, context-aware answers.
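As an illustration, a question like this could be translated into an Elasticsearch bool query combining a keyword filter, a `geo_distance` filter, and a time range. This is a minimal sketch only; the index and field names (`incident_type`, `location`, `reported_at`) are assumptions, not Pulse's actual schema:

```python
def build_incident_query(incident_type, lat, lon, radius_km, since, until):
    """Build an Elasticsearch bool query: keyword + geo + time filters.

    Field names are illustrative assumptions for this sketch.
    """
    return {
        "query": {
            "bool": {
                "filter": [
                    # Exact match on the incident category (e.g. "fire")
                    {"term": {"incident_type": incident_type}},
                    # Only incidents within radius_km of the user's location
                    {
                        "geo_distance": {
                            "distance": f"{radius_km}km",
                            "location": {"lat": lat, "lon": lon},
                        }
                    },
                    # Restrict to the requested time window (date math allowed)
                    {"range": {"reported_at": {"gte": since, "lt": until}}},
                ]
            }
        },
        "sort": [{"reported_at": "desc"}],
    }

# "What fires were reported near me yesterday?" → fires within 10 km, yesterday
query = build_incident_query("fire", 37.77, -122.42, 10, "now-1d/d", "now/d")
```

The AI assistant's job is then reduced to extracting the parameters (type, radius, time window) from the user's sentence and handing them to a builder like this.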
Its most critical feature is the offline BLE mesh network. When cellular service is down, the Android app automatically switches to this mode, allowing nearby users to share critical alerts and incident data directly with each other, peer-to-peer. This ensures an unbroken flow of information within a local area.
All of this is powered by a sophisticated backend that leverages the speed and flexibility of Elasticsearch for everything from geospatial queries and real-time analytics to advanced hybrid (keyword + semantic) search.
How I built it
Pulse was built with a mobile-first philosophy, prioritizing the in-field experience of a first responder or an affected citizen.
The App (React Native): The Android application was built using React Native, which allowed for rapid development of a cross-platform user interface. This is where the offline BLE mesh network logic lives, using native Bluetooth modules to broadcast and receive incident data when the app detects it has no internet connection.
The Brains (Python, AI & Elasticsearch): The backend is a Python Flask API that serves as the central hub. I integrated Elasticsearch deeply into the architecture, designing custom indices to handle the complex demands of geospatial data, vector embeddings for semantic search, and real-time aggregations for analytics. On top of this, I layered Google's Gemini 2.5 Flash model to power the chat assistant, NLP search translation, and intelligent data extraction.
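As a sketch of what the hybrid search could look like: Elasticsearch 8.x allows a lexical `match` query and an approximate-kNN `knn` clause in the same request, with per-clause boosts blending the two scores. The field names and weights below are illustrative assumptions, not Pulse's exact configuration:

```python
def hybrid_search_body(text, query_vector, k=10):
    """Request body combining BM25 keyword scoring with vector similarity.

    Assumes documents carry a `description` text field and a
    `description_vector` dense_vector field (illustrative names).
    """
    return {
        # Lexical leg: classic BM25 relevance, down-weighted
        "query": {"match": {"description": {"query": text, "boost": 0.4}}},
        # Semantic leg: approximate kNN over the embedding, up-weighted
        "knn": {
            "field": "description_vector",
            "query_vector": query_vector,
            "k": k,
            "num_candidates": 100,
            "boost": 0.6,
        },
        "size": k,
    }

body = hybrid_search_body("brush fire blocking highway", [0.1] * 768, k=5)
```

Both legs are scored and summed per document, so an incident that matches the user's wording *and* is semantically close to the query embedding ranks highest.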
Seamless Cloud Deployment: From the beginning, the project was designed for easy and scalable deployment. The backend API is containerized with Docker and deployed as a serverless application on Google Cloud Run, so it scales automatically with demand, from zero to thousands of requests, without any server management. The web frontend was built as a static site and is hosted on Firebase Hosting, which provides a global CDN and a simple deployment workflow that integrates cleanly with the rest of the Google Cloud ecosystem.
Challenges I ran into
The biggest technical challenge was undoubtedly the offline BLE mesh network. Bluetooth Low Energy has a tiny per-packet payload limit (on the order of 20 usable bytes at the default ATT MTU), which is far too small for a JSON object describing an incident. To solve this, I had to engineer a custom data fragmentation protocol that splits the data into uniquely identifiable chunks before broadcasting and reliably reassembles them on receiving devices.
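The actual fragmentation code lives in the React Native layer, but the core idea can be sketched in Python. The wire format here (a 4-byte message ID, single-byte index and total fields, an 18-byte chunk size) is an illustrative assumption, not the app's real protocol:

```python
import json
import math
import uuid

CHUNK_SIZE = 18  # usable payload bytes per packet after the header (assumed)

def fragment(incident: dict) -> list:
    """Split a JSON payload into self-describing chunks.

    Each chunk is: 4-byte message ID | 1-byte index | 1-byte total | data.
    The single-byte total caps a message at 255 chunks, fine for a sketch.
    """
    payload = json.dumps(incident, separators=(",", ":")).encode()
    msg_id = uuid.uuid4().bytes[:4]  # groups chunks of the same message
    total = math.ceil(len(payload) / CHUNK_SIZE)
    return [
        msg_id + bytes([i, total]) + payload[i * CHUNK_SIZE:(i + 1) * CHUNK_SIZE]
        for i in range(total)
    ]

def reassemble(chunks: list) -> dict:
    """Reorder chunks by index and rebuild the JSON (assumes all arrived)."""
    ordered = sorted(chunks, key=lambda c: c[4])  # byte 4 is the chunk index
    total = ordered[0][5]                          # byte 5 is the chunk count
    if len(ordered) != total:
        raise ValueError("missing fragments")
    return json.loads(b"".join(c[6:] for c in ordered))
```

Because every chunk carries its message ID, index, and total, receivers can interleave fragments from multiple broadcasters and still reassemble each message independently.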
Another significant challenge was the integration of multiple advanced systems. Making Elasticsearch, Google's AI, and the React Native frontend communicate flawlessly required careful data modeling, robust error handling, and a deep understanding of the entire stack.
Accomplishments that I'm proud of
I am incredibly proud of creating a functional and resilient offline BLE mesh network that solves a real-world communication problem. Demonstrating advanced Elasticsearch features in a single, production-ready application was a major achievement, showcasing its power beyond simple text search.
Finally, I am proud of the seamless and modern deployment architecture. The integration with Google Cloud Run and Firebase means the entire platform is scalable, secure, and easy to maintain, allowing me to focus on features rather than infrastructure.
What I learned
This project was a masterclass in full-stack, resilient system design. I learned how to architect and implement an advanced search system with Elasticsearch, apply generative AI to practical, real-world problems, and engineer low-level mobile networking solutions with Bluetooth. Most importantly, I learned how to tie all of these technologies together into a cohesive, production-ready platform deployed on a modern, serverless cloud stack.
What's next for Pulse
The future of Pulse is focused on making it even more intelligent and accessible. A top priority is expanding its reach by developing an iOS version of the mobile app, so that both iPhone and Android users can stay connected during a crisis. I also plan to leverage more of Elasticsearch's ML capabilities for anomaly detection, automatically flagging unusual incident patterns, and to enhance the mobile experience with push notifications and deeper integration with emergency services.
Built With
- bluetooth-low-energy-(ble)
- docker
- elasticsearch
- firebase
- flask
- google-cloud-run
- google-cloud-vertex-ai
- react-native