Inspiration
We were inspired by the ongoing conflict between Iran and Israel, where the "Survival Gap" turns smartphones into "digital bricks" as infrastructure collapses. Seeing families trapped in "Dark Zones" without maps or communication sparked our mission to build a decentralized, self-healing guardian. We believe that even when the world goes dark, the human right to navigation and safety must remain intact.
What it does
CrisisNav AI is a resilient, offline-first survival guardian. It transforms chaotic environmental data into simple, three-word survival commands. Using a combination of real-time geospatial intelligence, vision-based obstacle detection, and a peer-to-peer Bluetooth mesh network, it guides users—including the visually impaired and non-native speakers—to safety even when the internet and power grids go dark.
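The three-word command idea can be sketched as a small mapping from a hazard assessment to a terse directive. The hazard categories, vocabulary, and function name below are illustrative assumptions, not CrisisNav AI's actual command set:

```javascript
// Sketch: distill a hazard assessment into a three-word survival command.
// Categories and vocabulary are hypothetical, chosen only to show the shape
// of "complex input in, three words out".
function threeWordCommand({ hazard, bearing, distanceMeters }) {
  // Action word from the hazard type
  const action = hazard === "flood" ? "CLIMB" : hazard === "fire" ? "FLEE" : "MOVE";
  // Direction word from a compass bearing in degrees
  const direction =
    bearing < 45 || bearing >= 315 ? "NORTH" :
    bearing < 135 ? "EAST" :
    bearing < 225 ? "SOUTH" : "WEST";
  // Urgency word from proximity to the hazard
  const urgency = distanceMeters < 100 ? "NOW" : "SOON";
  return `${action} ${direction} ${urgency}`;
}

console.log(threeWordCommand({ hazard: "fire", bearing: 200, distanceMeters: 80 }));
// → "FLEE SOUTH NOW"
```

In the real app this distillation is done by the language model rather than a lookup table, but the contract is the same: whatever the inputs, the output stays short enough to speak, vibrate, or translate instantly.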
How we built it
The core intelligence is powered by Gemini 3.1 Flash, which distills complex crisis inputs into actionable cues. We integrated Google Lens for real-time visual re-routing and used the Nearby Messages API to create a decentralized mesh network. The UI was built for "Zero-Type" interaction, combining Voice-First processing with distinct haptic vibration patterns so the app remains accessible to users with disabilities and to anyone operating under high stress.
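A decentralized mesh of this kind typically relays alerts by controlled flooding: each node re-broadcasts a message once, with a hop budget and duplicate suppression. The sketch below shows that relay logic in plain node.js; the message shape and class names are assumptions, not the Nearby Messages payload format:

```javascript
// Sketch of controlled flooding over a peer-to-peer mesh. Each node relays a
// message at most once (dedup by id) and decrements a ttl hop budget so
// broadcasts die out instead of echoing forever.
class MeshNode {
  constructor(id) {
    this.id = id;
    this.peers = [];       // directly reachable nodes (e.g. via Bluetooth)
    this.seen = new Set(); // message ids already handled (duplicate suppression)
    this.inbox = [];
  }
  connect(other) {
    this.peers.push(other);
    other.peers.push(this);
  }
  receive(msg) {
    if (this.seen.has(msg.id)) return; // already relayed: drop the duplicate
    this.seen.add(msg.id);
    this.inbox.push(msg.payload);
    if (msg.ttl > 0) {
      const relayed = { ...msg, ttl: msg.ttl - 1 };
      for (const peer of this.peers) peer.receive(relayed);
    }
  }
}

// Three nodes in a line: A <-> B <-> C. A's alert reaches C by hopping via B,
// with no central server involved.
const [a, b, c] = [new MeshNode("A"), new MeshNode("B"), new MeshNode("C")];
a.connect(b);
b.connect(c);
a.receive({ id: "alert-1", ttl: 3, payload: "FLEE SOUTH NOW" });
console.log(c.inbox); // → [ 'FLEE SOUTH NOW' ]
```

The "self-healing" property falls out of this design: if B drops away but A and C later gain another common neighbor, the same flood-and-dedup rule routes around the gap with no reconfiguration.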
Challenges we ran into
Our biggest challenge was "Intelligence at the Edge"—maintaining high-level AI reasoning without a cloud connection. Additionally, designing a UI that remains legible and user-friendly while conveying urgent, high-stakes information required rigorous UX testing for high-contrast accessibility.
Accomplishments that we're proud of
We are incredibly proud of building a truly Universal Design tool. Successfully implementing a haptic navigation system for the visually impaired and a multilingual translation layer for refugees means our app serves the most vulnerable 15% of the population, who are often left behind by traditional tech. Seeing our "Offline Mesh" successfully sync nodes without Wi-Fi was a major technical milestone.
What we learned
We learned that in a crisis, "less is more." Technical sophistication must be hidden behind a simple, instinct-driven interface. We gained deep insights into Agentic AI workflows, the physics of mesh networking, and the importance of Inclusive Design—realizing that building for extreme edge cases (like war zones or blindness) actually creates a better, more resilient product for everyone.
What's next for CrisisNav AI
We plan to partner with local NGOs to field-test the mesh network in high-risk areas. Ultimately, our goal is to move beyond a mobile app and evolve CrisisNav AI into a global, self-healing rescue infrastructure that ensures no one is ever truly alone in their darkest hour.
Built With
- ai
- firebase (auth & firestore)
- gemini-3.1-flash-api
- google-cloud-functions
- google-lens-sdk
- google-maps-platform
- nearby-messages-api (bluetooth mesh)
- node.js
- translategemma