SafeStep | Navigate home without fear
Inspiration
- Personal stories and community reports about the “last mile” — people (especially women and gender-diverse folks) feeling unsafe walking from transit to home late at night.
- Existing navigation apps optimize for speed and ignore contextual safety signals (lighting, crowds, informal community reports).
- The rise of AR-capable phones — opportunity to give real-time spatial guidance that’s visible and intuitive without forcing eyes off the environment.
- Desire to create a privacy-respecting, verifiable reward (non-transferable badge) that recognizes completed safe journeys and encourages trustworthy reporting.
- Hackathon prompt to combine AI + AR + a strong evidence/proof story — perfect fit for a product that’s practical and demonstrable.
What it does
- Analyzes multiple safety signals (lighting, community reports, historical incidents, crowdsourced observations) to compute a safety score for routes.
- Offers two route modes: Safest route (default) and Fastest route (optional)—explicitly showing tradeoffs to the user.
- Presents an AR-guided path over the phone camera so users follow a clear, glowing route in the real world.
- Alerts users when they step off the safe path with clear microcopy and visual cues.
- Optionally mints a Soulbound Safe Passage badge (testnet) after a verified safe arrival — a verifiable proof artifact for social / civic use.
- Includes an accessible fallback (text-based checkpoint list) for users who cannot or prefer not to use AR.
- Privacy-first: location streams used for routing are ephemeral; persistent storage limited to optional hashed proofs for minting and anonymized analytics.
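The multi-signal safety score described above can be sketched as a weighted combination of normalized signals, with a route scored by its worst segment. This is an illustrative sketch: the signal names, weights, and min-over-segments rule are assumptions, not SafeStep's actual model.

```typescript
// Hypothetical safety-scoring sketch. Weights and signal names are illustrative.

interface SafetySignals {
  lighting: number;            // 0 (dark) .. 1 (well lit)
  communityReports: number;    // 0 (many recent negative reports) .. 1 (none)
  historicalIncidents: number; // 0 (frequent) .. 1 (rare)
  crowdPresence: number;       // 0 (deserted) .. 1 (busy)
}

// Assumed weights; they sum to 1 so the score stays in [0, 1].
const WEIGHTS: Record<keyof SafetySignals, number> = {
  lighting: 0.35,
  communityReports: 0.3,
  historicalIncidents: 0.2,
  crowdPresence: 0.15,
};

function safetyScore(signals: SafetySignals): number {
  let score = 0;
  for (const key of Object.keys(WEIGHTS) as (keyof SafetySignals)[]) {
    // Clamp each signal into [0, 1] before weighting.
    const v = Math.min(1, Math.max(0, signals[key]));
    score += WEIGHTS[key] * v;
  }
  return score;
}

// A route is only as safe as its worst segment: one unlit block
// should drag the whole route down, so take the minimum, not the mean.
function routeScore(segments: SafetySignals[]): number {
  return Math.min(...segments.map(safetyScore));
}
```

Using the minimum rather than the average is what lets the Safest/Fastest toggle show a real tradeoff: a fast route with one bad block scores low even if the rest is fine.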
How we built it
Architecture (summary)
- Client (mobile web / PWA + native wrapper) — React Native / Expo for quick dev + WebAR fallback for no-install users.
- AR presentation — WebXR / WebAR surface for browser fallback; native ARKit/ARCore when available for best stability.
- Mapping & routing — Map tiles & POIs from OpenStreetMap, routing overlay combines map routes + safe-path adjustments.
- AI decision layer — Goose (AI agent) mediates multi-signal analysis: lighting estimates, incident summarization, and community report weighting.
- Backend — lightweight Node/Express prototype for the API, with a toggle to run entirely on local mock data for demos.
- Proof & minting — WalletConnect for demo, Polygon testnet for low-cost minting of a soulbound badge; metadata includes route hash, time, and safety score.
- Data sources — public datasets (lighting/POI), user-submitted reports, optional local open crime datasets, and real-time device sensors.
- Developer & design — Figma Make prompt generated high-fidelity prototype, auto-export assets for Devpost screenshots and a one-page Criteria Evidence summary.
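The proof-and-minting piece above (a route hash plus time and safety score in the badge metadata, without storing raw GPS) can be sketched as follows. This is a minimal sketch of the idea, not the shipped code: the grid size, field names, and `mintMetadata` shape are assumptions.

```typescript
// Hypothetical privacy-preserving route proof: snap coordinates to a coarse
// grid, then hash the cell sequence. The hash commits to the route's shape
// while exact positions are never persisted.
import { createHash } from "node:crypto";

type LatLng = { lat: number; lng: number };

// Snap to 3 decimal places (~100 m cells) so small GPS jitter
// does not change the proof.
function toCell(p: LatLng): string {
  return `${p.lat.toFixed(3)},${p.lng.toFixed(3)}`;
}

function routeProofHash(points: LatLng[]): string {
  // Drop consecutive duplicate cells so dwell time doesn't alter the hash.
  const cells: string[] = [];
  for (const p of points) {
    const c = toCell(p);
    if (cells[cells.length - 1] !== c) cells.push(c);
  }
  return createHash("sha256").update(cells.join("|")).digest("hex");
}

// Assumed metadata shape for the soulbound badge minted on the testnet.
function mintMetadata(points: LatLng[], safetyScore: number) {
  return {
    name: "Safe Passage",
    routeHash: routeProofHash(points),
    completedAt: new Date().toISOString(),
    safetyScore,
  };
}
```

Because only the hash leaves the device, the badge can later be checked against a claimed route without anyone holding the raw trace.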
Tech stack (prototype)
- Frontend: React Native (Expo) + WebXR fallback
- AR: ARKit / ARCore + three.js overlays or WebXR layers
- Maps: OpenStreetMap + Mapbox-style tiles (or MapLibre)
- AI orchestration: Goose agent
- Backend: Node.js + Express
- Wallet / blockchain: WalletConnect + Polygon testnet (SBT)
- Design & prototype: Figma (Make prompt for rapid generation)
- Testing: Manual QA, device smoke tests across iPhone and Android, accessibility linting
Challenges we ran into
- Data sparsity for lighting — public lighting datasets are inconsistent across cities; required synthesizing lighting from POIs, satellite imagery proxies, and community reports.
- Balancing safety vs. surveillance — routing on police incident data can disproportionately steer users away from over-policed communities; we had to design report weighting and explainability to mitigate that bias.
- AR stability in low-light — AR tracking is less robust at night; needed fallback UX when camera tracking dropped or drifted.
- Third-party services — for the hackathon we had to convincingly simulate the Goose and wallet flows while keeping the demo believable.
- Privacy tradeoffs — proving a route without storing raw GPS required building a verifiable hashing approach while staying simple for demo judges.
- Time constraints — polishing the handoff artifacts (developer tokens, export images, evidence PDF) while implementing core flows was tight.
- Device fragmentation — varying camera permissions and WebXR support across Android and iOS required careful prototype branching and clear permission microcopy.
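The off-path alerts and the non-AR fallback both need a cheap "how far am I from the route?" check that keeps working when camera tracking drops. A minimal sketch, assuming a point-to-segment distance over a local equirectangular projection and an illustrative 25 m threshold:

```typescript
// Hypothetical off-path check: project nearby coordinates to local metres,
// then take the minimum distance from the user to the route polyline.
type LatLng = { lat: number; lng: number };
type XY = { x: number; y: number };

const EARTH_RADIUS_M = 6371000;

// Equirectangular projection: accurate enough over a few hundred metres.
function toLocalMetres(p: LatLng, origin: LatLng): XY {
  const rad = Math.PI / 180;
  return {
    x: (p.lng - origin.lng) * rad * EARTH_RADIUS_M * Math.cos(origin.lat * rad),
    y: (p.lat - origin.lat) * rad * EARTH_RADIUS_M,
  };
}

function pointToSegment(p: XY, a: XY, b: XY): number {
  const abx = b.x - a.x, aby = b.y - a.y;
  const len2 = abx * abx + aby * aby;
  // Clamp the projection of p onto segment ab to [0, 1].
  const t = len2 === 0
    ? 0
    : Math.max(0, Math.min(1, ((p.x - a.x) * abx + (p.y - a.y) * aby) / len2));
  return Math.hypot(p.x - (a.x + t * abx), p.y - (a.y + t * aby));
}

function isOffPath(user: LatLng, route: LatLng[], thresholdM = 25): boolean {
  const pts = route.map((r) => toLocalMetres(r, user));
  const u: XY = { x: 0, y: 0 }; // user is the projection origin
  let min = Infinity;
  for (let i = 0; i + 1 < pts.length; i++) {
    min = Math.min(min, pointToSegment(u, pts[i], pts[i + 1]));
  }
  return min > thresholdM;
}
```

Because this runs on plain GPS fixes, the same check can drive the text-checkpoint fallback and haptic cues when AR tracking drifts at night.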
Accomplishments that we're proud of
- End-to-end prototype from onboarding → Goose analysis → AR guidance → minting simulation, demo-ready in Figma and on-device.
- Clear evidence layer built into the design that maps UI artifacts to judging criteria (Clarity, Proof, Usability, Rigor, Polish).
- Privacy-first proof model: optional hashed route proof that’s adequate for mint metadata without storing identifiable traces.
- Soulbound mint proof of concept (testnet) that demonstrates how a verifiable, non-transferable badge can incentivize safe behavior and responsible reporting.
- Accessible fallback flow for users who cannot use AR — shows we considered inclusion from the start.
- Decision log documenting tradeoffs (e.g., community reports weighted higher than raw incident counts to avoid over-policing).
- Design system & export: tokens, assets, and demo screenshots generated for Devpost and judges — polished and consistent.
What we learned
- Explainability matters: judges and users want to see why a route is safe. An AI analysis chain (Goose → lighting → crowd reports) dramatically increases trust.
- Design for failure modes: AR works great when conditions are right — but you must design graceful fallbacks (text checkpoints, audible guidance).
- Bias mitigation is not optional: naive use of police incident data can harm communities; explicit weighting and documented decisions are necessary.
- Privacy wins trust: minimizing stored location data and offering opt-in hashed proofs improved user comfort in testing.
- Small UX details matter: clear permission microcopy (“SafeStep needs camera + location…”) and large touch targets made our prototype feel production-ready.
- Judges appreciate evidence-first submissions: the Criteria Evidence page and a one-page PDF summarizing proof were frequently complimented during quick demo walkthroughs.
What's next for SafeStep | Navigate home without fear
Short-term (next 3 months)
- Implement real lighting ingestion using municipal open data and satellite-derived illuminance approximations.
- Run a small, opt-in pilot with a local transit agency or university campus to collect community reports and improve the model.
- Harden AR fallback UX: add audible directions, haptic cues, and a low-bandwidth text-checkpoint flow.
- Produce a one-page Criteria Evidence PDF and short demo video (5 storyboard frames) for judges and partners.
Mid-term (3–12 months)
- Launch a privacy-safe beta: opt-in reporting, hashed route proofs, and audit logs for bias analysis.
- Integrate more public safety datasets with clear provenance and versioning.
- Add multi-language support and improved accessibility (screen-reader-first flows, reduced-motion defaults).
- Explore partnerships with transit agencies, universities, and nonprofits focused on community safety.
Long-term (12+ months)
- Scale the minting model into a trusted civic credential (with careful governance and anti-abuse safeguards).
- Build policy and research collaborations to publish findings on safe routing and community reporting impacts.
- Commercial considerations: B2B offerings for campuses or events, and grants / sponsorships for public deployments.
75HER Challenge: Project Documentation for SafeStep
https://docs.google.com/document/d/1rIfkmnF5CUSm3SencnjGbDEIudj57ccN-C05HqDMTK4/edit?usp=sharing
Built With
- blockchain
- goose
- lovable
- nft
- replit