Inspiration
Crowd disasters are rarely caused by a single failure. They emerge from human behavior under extreme density, uncertainty, and cognitive overload.
Tragedies like the 2022 Itaewon Halloween crowd crush, the 2021 Astroworld Festival disaster, the 2010 Love Parade crush, and many other large-scale events reveal a consistent pattern:
people don’t fail because they lack information; they fail because too much is happening at once.
Research on crowd dynamics and disaster psychology shows that as density rises and movement destabilizes, people experience:
- loss of situational awareness
- delayed reaction times
- instinctive herding behavior
By the time a situation is clearly labeled an “emergency,” it is often already too late.
ClearPath is driven by one question:
What if we could intervene earlier — before panic, before collapse, before injury?
What it does
ClearPath detects cognitive overload and unstable crowd movement at large events and intervenes in real time to reduce risk.
Using wall-mounted cameras, the system continuously analyzes crowd density and motion patterns. When potentially dangerous behavior is detected, ClearPath flags the affected area and triggers a coordinated intervention:
- A red illuminated zone highlights the congested or unstable region that needs resolution
- A surrounding green guidance zone indicates the safe direction people should move toward
This intervention is designed to be simple, intuitive, and non-verbal, cutting through noise and confusion without overwhelming people.
Rather than monitoring crowds passively, ClearPath actively guides behavior to restore stability.
How we built it
ClearPath combines real-time computer vision, crowd behavior analysis, and an interactive visualization layer.
Perception & Risk Analysis
- Implemented YOLOv8 Nano for real-time person detection
- Applied optical flow analysis (Lucas–Kanade) to track crowd movement dynamics
- Designed four key risk metrics: crowd density, bidirectional flow, flow conflict, and stop-go waves
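To make the risk metrics concrete, here is a minimal sketch of how two of them (crowd density and flow conflict) could be computed from per-person motion vectors. The function names and thresholds are illustrative assumptions, not ClearPath's actual code:

```python
import math

def crowd_density(num_people, zone_area_m2):
    """People per square metre inside a monitored zone."""
    return num_people / zone_area_m2

def flow_conflict(vectors):
    """Fraction of motion-vector pairs pointing in roughly opposite
    directions (more than 120 degrees apart). A high value indicates
    bidirectional or conflicting flow inside the zone."""
    conflicts, pairs = 0, 0
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            (ax, ay), (bx, by) = vectors[i], vectors[j]
            na, nb = math.hypot(ax, ay), math.hypot(bx, by)
            if na == 0 or nb == 0:
                continue  # skip stationary people
            pairs += 1
            cosine = (ax * bx + ay * by) / (na * nb)
            if cosine < -0.5:  # angle > 120 degrees
                conflicts += 1
    return conflicts / pairs if pairs else 0.0

# Example: two people walking right, two walking left -> heavy conflict
vecs = [(1, 0), (1, 0), (-1, 0), (-1, 0)]
print(crowd_density(4, 2.0))  # 2.0 people per m^2
print(flow_conflict(vecs))
```

In the real pipeline the motion vectors would come from the Lucas–Kanade tracker and the person counts from YOLOv8 detections; the aggregation idea stays the same.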
Stability Logic
- Danger zones are triggered only when risky patterns persist over time, preventing flicker and allowing realistic response windows.
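The persistence rule above can be sketched as a small hysteresis state machine. The class name and frame counts are illustrative assumptions, not ClearPath's actual code: a zone turns "dangerous" only after the risk signal stays high for several consecutive frames, and clears only after a sustained calm period, so the overlay never flickers:

```python
class DangerZoneState:
    """Hysteresis sketch: require sustained risk before triggering,
    and sustained calm before releasing (frame counts are examples)."""

    def __init__(self, trigger_frames=30, release_frames=60):
        self.trigger_frames = trigger_frames
        self.release_frames = release_frames
        self.hot_streak = 0   # consecutive high-risk frames
        self.calm_streak = 0  # consecutive low-risk frames
        self.active = False

    def update(self, risk_is_high):
        if risk_is_high:
            self.hot_streak += 1
            self.calm_streak = 0
            if self.hot_streak >= self.trigger_frames:
                self.active = True
        else:
            self.calm_streak += 1
            self.hot_streak = 0
            if self.calm_streak >= self.release_frames:
                self.active = False
        return self.active
```

Using two separate thresholds means a single noisy frame can neither raise a false alarm nor prematurely dismiss a real one.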
Frontend & Simulation
- Built a React-based dashboard with real-time video overlays
- Implemented a tactical map view showing a bird’s-eye perspective of the event space with simulated drone deployment to intervene at danger zones
- Used canvas-based rendering and detection caching to maintain real-time performance
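The detection-caching idea is language-agnostic; the sketch below shows the pattern in Python (the actual dashboard is React/canvas, and the names here are illustrative assumptions). The expensive detector runs only every few frames, and intermediate frames reuse the cached boxes so overlay rendering stays real-time:

```python
class CachedDetector:
    """Run the heavy detector every `interval` frames; reuse the
    cached result in between (a common real-time CV optimization)."""

    def __init__(self, detect_fn, interval=5):
        self.detect_fn = detect_fn  # e.g. a YOLOv8 inference call
        self.interval = interval
        self.frame_count = 0
        self.cached = []

    def __call__(self, frame):
        if self.frame_count % self.interval == 0:
            self.cached = self.detect_fn(frame)  # fresh detections
        self.frame_count += 1
        return self.cached  # cached boxes for skipped frames
```

With an interval of 5 at 30 fps, the detector runs 6 times per second while the overlay still updates every frame; pedestrians move little enough between frames that the stale boxes remain usable.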
Challenges we ran into
- We initially focused on emergency signage and alerts, but realized most venues already handle clearly labeled emergencies well. The real gap lies in ambiguous, pre-emergency situations where danger is building but not yet obvious.
- We attempted to integrate an Arduino-based setup, but real-time computer vision from live video exceeded its practical limits within the hackathon timeframe.
- High-quality footage of real crowd instability is rare. Many public datasets are blurry, low-resolution, or poorly annotated, making validation time-consuming.
Accomplishments that we're proud of
- Translating academic crowd research into a practical, visual intervention
- Achieving real-time performance through careful optimization
What we learned
- The project pushed us to learn computer vision while grounding technical decisions in real human pain points
- Effective safety systems must reduce cognitive load, not add information
- Understanding user psychology is as important as model accuracy
What's next for ClearPath
- Integrate with real drone platforms to run sim-to-real evaluations at live events
- Explore multimodal cues like audio and haptics for accessibility and redundancy