Inspiration

Inspiration for project-synapse came from a moment of clarity about how disconnected our digital systems still are from the way human intelligence naturally flows. Today's cities are broken into silos; Synapse turns the city into a single body. Emergencies happen, and we need to be ready when they do.

What it does

Project Synapse is an Autonomous Urban Operating System that breaks down city department silos during emergencies.

When a disaster occurs (fire, earthquake, accident), our multi‑agent AI instantly coordinates:

· 🚦 Traffic – Creates green-light corridors for emergency vehicles
· ⚡ Power Grid – Diverts surplus energy to hospitals and evacuation centers
· 🧑‍⚕️ Emergency Services – Dispatches nearest units dynamically

The system has three core behaviours:

· Sense – Ingests real‑time IoT data (simulated, but structure‑ready for real sensors)
· Decide – Uses a Gemini‑powered Crisis Supervisor to choose optimal actions
· Act – Outputs JSON commands that could directly control traffic lights or grid switches

Crucially, it includes an ethical guardrail – it will never depower a hospital, even if the AI suggests it. And it has a self‑correcting loop: after the emergency ends, Synapse automatically reverts all overrides.
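The guardrail is a hard validation layer that sits between the AI's proposed commands and the actuator output, so no model decision can bypass it. The sketch below shows the idea; the names (`PROTECTED_FACILITIES`, the command schema) are illustrative, not our exact production code:

```python
# Hard-coded ethical guardrail: runs on every AI-proposed command before
# it is emitted. The facility set and command fields here are simplified
# stand-ins for the real schema.

PROTECTED_FACILITIES = {"hospital", "evacuation_center"}

def enforce_guardrails(command: dict) -> dict:
    """Reject any grid command that would depower a protected facility."""
    if (command.get("action") == "depower"
            and command.get("target_type") in PROTECTED_FACILITIES):
        return {
            "action": "noop",
            "reason": f"guardrail: refusing to depower {command['target_type']}",
        }
    return command

# Even if the model proposes cutting hospital power, the guardrail wins:
proposed = {"action": "depower", "target_type": "hospital", "target_id": "H-7"}
safe = enforce_guardrails(proposed)  # -> {"action": "noop", ...}
```

Because the check runs outside the model, it holds even when the AI is adversarially prompted, which is what the hospital-power test exercised.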

How we built it

| Layer | Technology | What it does |
|---|---|---|
| Frontend | Next.js 14, TypeScript, Tailwind CSS, Mapbox GL JS | Real‑time digital twin map + reasoning‑log panel |
| Backend | FastAPI (Python), WebSockets | Handles city state, simulates IoT events, streams decisions |
| AI Brain | LangGraph + Gemini 2.0 Flash (Google AI Studio) | Orchestrates agents; Gemini generates tactical decisions |
| State Management | LangGraph StateGraph with memory | Preserves emergency state across agent handoffs |
| Simulation | Python Faker + asyncio | Generates 100 virtual sensors (traffic, grid, weather) in real time |
| Deployment | Render (backend) + Vercel (frontend) | Live demo accessible via link |
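The simulation layer fits in a few lines. This sketch uses only the standard library (`random` in place of Faker) and emits one reading per sensor per tick; the sensor names and reading schema are illustrative, not the exact production shapes:

```python
import asyncio
import random

SENSOR_TYPES = ("traffic", "grid", "weather")

def make_sensors(n: int = 100) -> list:
    """Create n virtual sensors spread across the three types."""
    return [
        {"id": f"sensor-{i}", "type": random.choice(SENSOR_TYPES)}
        for i in range(n)
    ]

async def stream_readings(sensors, ticks: int, interval: float = 0.01):
    """Yield a batch of randomized readings every `interval` seconds."""
    for _ in range(ticks):
        batch = [
            {"id": s["id"], "type": s["type"], "value": round(random.random(), 3)}
            for s in sensors
        ]
        yield batch
        await asyncio.sleep(interval)

async def main():
    sensors = make_sensors()
    async for batch in stream_readings(sensors, ticks=3):
        print(f"tick: {len(batch)} readings")  # prints "tick: 100 readings" x3

asyncio.run(main())
```

In the real backend the batches are pushed over the FastAPI WebSocket instead of printed, which is why the generator shape was chosen: the same loop body works for either sink.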

Building workflow:

  1. Simulated IoT data stream
  2. Built single‑agent Gemini caller
  3. Wrapped it in LangGraph with state
  4. Added second agent (traffic) and conditional edges
  5. Built frontend map + WebSocket connection
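Steps 3 and 4 are where LangGraph earns its keep. The library-free sketch below mimics the same shape (a shared state dict flowing through nodes, with a conditional edge deciding whether the traffic agent runs) so the routing logic is easy to see in isolation; the real build wires the same nodes into a `langgraph` StateGraph:

```python
# Library-free mock of the wiring from steps 3-4. Node names and the
# state fields are simplified stand-ins for the production graph.

def crisis_supervisor(state: dict) -> dict:
    # In production this node calls Gemini; here the decision is stubbed.
    state["reasoning_log"].append(f"supervisor: assessing {state['event']}")
    state["needs_traffic"] = state["event"] in ("fire", "accident")
    return state

def traffic_agent(state: dict) -> dict:
    state["reasoning_log"].append("traffic: opening green-light corridor")
    state["commands"].append({"agent": "traffic", "action": "green_corridor"})
    return state

def run_graph(event: str) -> dict:
    state = {"event": event, "reasoning_log": [], "commands": []}
    state = crisis_supervisor(state)
    if state["needs_traffic"]:          # the "conditional edge"
        state = traffic_agent(state)
    return state

result = run_graph("fire")  # supervisor runs, then traffic agent
```

Adding a third agent (e.g. the power grid) is one more node plus one more conditional branch, which is why the single-agent-first build order in the list above worked well.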

Challenges we ran into

| Challenge | How we solved it |
|---|---|
| Gemini returning malformed JSON | Added a fallback parser and retry logic with json.loads(clean_response) |
| WebSocket connections dropping under load | Implemented reconnection with exponential backoff in the frontend |
| LangGraph state becoming huge | Trimmed reasoning_log to the last 20 entries and stored only diffs |
| Google Maps API key security | Created separate projects for Maps (public) vs Gemini (private), restricted by HTTP referrer |
| Simulating realistic emergencies | Built a weighted random generator: 60% accidents, 25% fires, 15% earthquakes |
| Judges wanting to see the AI's "thought process" | Added a scrolling ReasoningLog component that shows every decision step in plain English |
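The malformed-JSON fix from the first row boils down to stripping markdown fences and retrying on a narrowed slice of the response. A simplified stdlib version (the function name and retry count are ours, not the exact production code):

```python
import json
import re

def parse_gemini_json(raw: str, retries: int = 2) -> dict:
    """Parse a model response that may be wrapped in ```json fences
    or padded with prose before/after the JSON object."""
    clean_response = re.sub(r"```(?:json)?", "", raw).strip()
    for _ in range(retries + 1):
        try:
            return json.loads(clean_response)
        except json.JSONDecodeError:
            # Fallback: slice out the outermost {...} and try again.
            start, end = clean_response.find("{"), clean_response.rfind("}")
            if start == -1 or end <= start:
                break
            clean_response = clean_response[start:end + 1]
    # Last resort: a safe no-op so the graph never crashes mid-emergency.
    return {"action": "noop", "reason": "unparseable model output"}

cmd = parse_gemini_json('```json\n{"action": "green_corridor"}\n```')
# -> {"action": "green_corridor"}
```

Returning a `noop` command instead of raising keeps a single bad model response from taking down the whole decision loop.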

Accomplishments that we're proud of

· Uninterrupted Autonomy: The system successfully managed sequential emergencies for hours without any manual intervention.
· Robust Ethical Boundaries: Our LangGraph-based safety interrupts blocked deliberate attempts to cut power to hospitals.
· High-Speed Performance: Sub-800 ms response times from initial trigger to dashboard visualization, even on free-tier infrastructure.
· Architectural Flexibility: The modular design allows easy integration of real-world city APIs without compromising system stability.
· Cost-Efficiency: Development was completed with zero cloud expenditure by leveraging local simulations and free-tier services like Gemini and Mapbox.
· Total Interpretability: Every action is backed by a detailed reasoning log, eliminating black-box uncertainty.

What we learned

· LangGraph Utility: Definitely not overkill. Using LangGraph for stateful, multi-step workflows eliminated hundreds of lines of manual state management and conditional logic.
· Prompting Supremacy: High-quality prompt engineering remains vital; a well-structured system prompt is the primary factor in turning a loose response into a precise JSON decision.
· Prioritize Simulation: Mock the IoT data and focus on the AI logic first, so development is never stalled by hardware requirements.
· Hackathon Security: Even in competitive environments, security is a priority. A separate-project architecture for API keys prevented exposure and addressed judge concerns.
· Functionality over Complexity: A functional demo featuring a live map and reasoning log is far more valuable than a sophisticated but non-operational backend.
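To make the prompting point concrete, here is the shape such a JSON-forcing system prompt takes. The wording below is illustrative, not our exact production prompt; the three ingredients (role, explicit schema, "JSON only") are what matter:

```python
# Illustrative system prompt for the Crisis Supervisor node. The action
# names mirror those used elsewhere in this write-up but are assumptions.
CRISIS_SUPERVISOR_PROMPT = """\
You are the Crisis Supervisor for a city's emergency response system.
Given the current sensor state, decide the next tactical action.

Respond with ONLY a JSON object, no prose, no markdown fences, matching:
{
  "action": "<green_corridor | divert_power | dispatch_units | noop>",
  "target_id": "<string>",
  "reason": "<one short sentence>"
}
If no action is safe, return {"action": "noop", "target_id": "", "reason": "..."}.
"""
```

Pinning the schema in the prompt is what makes the fallback parser's job easy: almost every response is already valid JSON, and the parser only handles the rare fenced or padded reply.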

What's next for project-synapse

| Phase | Objective | Timeline |
|---|---|---|
| Live IoT Implementation | Transition from simulated inputs to authentic urban data via public APIs, such as power grid monitors and traffic cameras | 2 weeks |
| Scalable Deployment | Adjust agent prompts so any municipality can integrate Synapse using its own GeoJSON mapping data | 1 month |
| Federated Network | Enable inter-city sharing of anonymized emergency insights without compromising raw data security or centralizing storage | 3 months |
| Citizen Connectivity | Launch a mobile interface delivering real-time evacuation guidance and safety alerts via push notifications and SMS | 6 weeks |
| Public Release | Open the complete codebase under an MIT license, with a simplified Docker Compose setup for easy installation | Post-Hackathon |
| Strategic Expansion | Enter the startup contest at the Smart City Expo World Congress | Nov 1, 2026 |

🎯 Immediate next step: Deploy the live demo link and GitHub repo to the hackathon platform. Then add a 30‑second screen recording of the AI handling a fire + earthquake double‑disaster.

