Semicolon Project Story

Inspiration

Semicolon started from a simple problem: people know which streets feel dangerous, but that knowledge usually stays anecdotal. Accidents, deaths, and near misses are all problems that need to be solved; with Semicolon, we combat them through live hazard detection and dashcam-style recording for any vehicle, whether it's a car, a scooter, or a bike!

What it does

Semicolon detects and documents road hazards in real time.

  • Monitors camera frames for vehicles, pedestrians, and other hazards.
  • Provides immediate rider-facing awareness through a live HUD.
  • Logs geotagged events with context so incidents are not lost (see the event sketch after this list).
  • Powers replay, records, and danger-map views to reveal recurring risk areas.
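
As a rough illustration, a logged event could look like the Python sketch below; the field names are assumptions for this write-up, not the exact production schema.

```python
# Sketch of a geotagged hazard event record; field names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class HazardEvent:
    label: str            # raw COCO class name, e.g. "car" or "person"
    confidence: float     # detector confidence in [0, 1]
    score: float          # HUD risk score at capture time
    latitude: float       # rider location when the event fired
    longitude: float
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
```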

In short, it helps people react to risk now while building a clearer picture of unsafe streets over time.

How we built it

Semicolon is an end-to-end system across mobile, backend, and visualization layers.

  • Mobile client: a native iOS app (Swift/Xcode).
  • Perception service: Python FastAPI sidecar running YOLO for frame-level detection (sketched below).
  • API layer: Next.js routes for perception orchestration and event flow.
  • Data layer: Event persistence and retrieval for records and replay.
  • UI layer: Dashcam-style overlays, detection boxes, scoring, and navigation/gallery controls.
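
To make the perception service concrete, here is a minimal sketch of what the FastAPI sidecar could look like; the endpoint path, model weights, and response shape are illustrative assumptions rather than our exact implementation.

```python
# Minimal sketch of the perception sidecar: FastAPI + Ultralytics YOLO.
# Endpoint path, model file, and response shape are assumptions for illustration.
import io

from fastapi import FastAPI, File, UploadFile
from PIL import Image
from ultralytics import YOLO

app = FastAPI()
model = YOLO("yolov8n.pt")  # small pretrained COCO model; stands in for our weights

@app.post("/detect")
async def detect(frame: UploadFile = File(...)):
    # Decode the uploaded frame and run one YOLO inference pass on it.
    image = Image.open(io.BytesIO(await frame.read()))
    result = model(image)[0]
    # Return raw COCO labels plus confidence and pixel-space boxes,
    # which the downstream scoring and HUD overlays consume.
    return {
        "detections": [
            {
                "label": result.names[int(box.cls)],
                "confidence": float(box.conf),
                "box": [float(v) for v in box.xyxy[0]],
            }
            for box in result.boxes
        ]
    }
```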

At a high level, our HUD risk score can be represented as: Score = min(100, max over i of (42*c_i + alpha_i + beta_i + gamma_i))

Where:

  • c_i = confidence of detection i
  • alpha_i = hazard-class bonus
  • beta_i = center-proximity bonus
  • gamma_i = box-prominence bonus
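
In code, that scoring rule might look like the sketch below; the bonus tables and scales are placeholder assumptions, not the weights we shipped.

```python
# Sketch of the HUD risk score:
#   Score = min(100, max_i(42 * c_i + alpha_i + beta_i + gamma_i))
# The bonus values and scales below are illustrative assumptions.
HAZARD_BONUS = {"person": 20, "car": 10, "truck": 15}  # alpha: hazard-class bonus

def risk_score(detections, frame_w, frame_h):
    """detections: dicts with 'label', 'confidence', and 'box' = [x1, y1, x2, y2]."""
    best = 0.0
    for d in detections:
        x1, y1, x2, y2 = d["box"]
        alpha = HAZARD_BONUS.get(d["label"], 0)
        # beta: center-proximity bonus, largest when the box center is mid-frame.
        center_x = (x1 + x2) / 2 / frame_w
        beta = 10 * (1 - 2 * abs(center_x - 0.5))
        # gamma: box-prominence bonus, scaled by the fraction of the frame covered.
        gamma = 10 * ((x2 - x1) * (y2 - y1)) / (frame_w * frame_h)
        best = max(best, 42 * d["confidence"] + alpha + beta + gamma)
    return min(100.0, best)
```

For example, under these assumed bonuses, a centered car filling a quarter of the frame at 0.9 confidence scores min(100, 42·0.9 + 10 + 10 + 2.5) ≈ 60.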

Challenges we ran into

  • Keeping behavior consistent across Swift, Next.js, and Python.
  • Balancing latency, frame throughput, and network reliability for live inference.
  • Managing camera edge cases (orientation, multicam limits, front/back handling).
  • Resolving class-label mismatches and downstream scoring assumptions.
  • Handling setup friction (dependencies, local networking, ngrok, environment config).

Accomplishments that we're proud of

  • Built a full pipeline from live camera input to usable safety events.
  • Shipped real-time HUD detection behavior, not just offline analysis.
  • Added a conditional front-camera picture-in-picture (PiP) view in the native iOS app when detections appear.
  • Standardized around raw COCO labels and aligned downstream logic.
  • Delivered a demo-ready system that can evolve into long-term civic tooling.

What we learned

  • Integration quality matters as much as model quality.
  • Stable data contracts are essential across multi-service systems.
  • Safety UX must be clear, fast, and low-distraction.
  • Near-miss data is highly valuable and often missing from traditional datasets.
  • Clear ownership boundaries make multi-stack collaboration much faster.

What's next for Semicolon

  • Improve low-light robustness and nighttime detection performance.
  • Add stronger on-device inference paths for latency and offline reliability.
  • Expand event intelligence (similar-incident retrieval and better hotspot clustering).
  • Validate with more real-world ride data and edge-case testing.
  • Improve reporting workflows so communities and planners can act on the data faster.