Links:
The core of this project is a Lens Studio project, found at the main repository linked to this Devpost:
- https://github.com/RishiAitha/safe-sight

There is also an iOS companion app used for receiving emergency videos:
- https://github.com/chaconshania/EmergencyVideoApp
Inspiration
As college students, and three of us being women, we were inspired by a shared frustration: safety tools often require you to look at your phone, which can make situations more dangerous. We’ve all experienced moments where we felt unsafe walking alone, navigating unfamiliar places, or reacting to unexpected situations. In those moments, pulling out a phone can reduce situational awareness, slow reaction time, and signal vulnerability to the people around you. Existing safety apps can feel distracting and overwhelming, often surfacing information that isn’t relevant in the moment. We wanted to build something that lets you warn others, get safer directions, and access help without breaking presence.
What it does
Our project is a Snap Spectacles AR safety lens that lets users:
- Set a destination and receive directions via an arrow that guides them along a Google Maps route
- Flag nearby dangers using buttons anchored to the palm: suspicious individuals, environmental hazards, and armed individuals
- Instantly notify other users within a 1-mile radius of a reported hazard
- Receive hazard notifications from nearby users within that same 1-mile radius
- Receive real-time safer routing in AR during unsafe situations
- See the location of the nearest campus emergency blue light for active in-person help
- Trigger an emergency mode that records 5 seconds of the user’s POV and sends the recording and location to trusted emergency contacts via a companion app
The goal is to create a shared safety network while maintaining individual awareness.
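The 1-mile radius check above amounts to a great-circle distance test between the user and each reported hazard. A minimal sketch of that check in TypeScript (the function names here are illustrative, not the actual lens scripts):

```typescript
const EARTH_RADIUS_MI = 3958.8;

function toRadians(deg: number): number {
  return (deg * Math.PI) / 180;
}

// Great-circle (haversine) distance in miles between two lat/lon points.
function distanceMiles(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const dLat = toRadians(lat2 - lat1);
  const dLon = toRadians(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRadians(lat1)) * Math.cos(toRadians(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_MI * Math.asin(Math.sqrt(a));
}

// A hazard is surfaced to a user only if it falls within 1 mile.
function withinAlertRadius(
  userLat: number, userLon: number,
  hazardLat: number, hazardLon: number
): boolean {
  return distanceMiles(userLat, userLon, hazardLat, hazardLon) <= 1.0;
}
```

The same distance function works for both directions of the network: deciding who to notify about a new report, and filtering incoming hazards down to the ones nearby.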
How we built it
We built the project using Lens Studio for Snap Spectacles. We designed the system intentionally to reduce phone dependency, relying on quick-access actions, a smart notification system, and Snap Cloud integration for real-time updates and shared data that becomes more useful as more users contribute. Our key components included:
- Minimal/non-intrusive map/UI interface for navigation and hazard identification
- Snap cloud database for emergency video upload and public hazard sharing
- Google Maps Routes API integration for navigation along walking routes and re-routing to avoid hazards
- Selection of the best available route from the API to avoid hazards along the walking path
- iOS companion app to load emergency videos and receive location information from contacts
- Frame capture/buffer/upload system to quickly send emergency recordings to the Snap cloud database
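One way to pick the safest of the API's alternative routes is to score each candidate by its closest approach to any reported hazard and keep the route whose nearest hazard is farthest away. A sketch of that scoring idea (illustrative only, not our exact logic; it uses a rough planar distance that is adequate for short walking routes):

```typescript
interface LatLng { lat: number; lng: number; }

// Rough planar distance in meters between two nearby points.
function approxMeters(a: LatLng, b: LatLng): number {
  const mPerDegLat = 111_320;
  const mPerDegLng = 111_320 * Math.cos((a.lat * Math.PI) / 180);
  const dy = (a.lat - b.lat) * mPerDegLat;
  const dx = (a.lng - b.lng) * mPerDegLng;
  return Math.hypot(dx, dy);
}

// Score a route by its closest approach to any hazard;
// a larger minimum distance means a safer route.
function routeSafetyScore(routePoints: LatLng[], hazards: LatLng[]): number {
  let minDist = Infinity;
  for (const p of routePoints) {
    for (const h of hazards) {
      minDist = Math.min(minDist, approxMeters(p, h));
    }
  }
  return minDist;
}

// Choose the alternative whose closest hazard is farthest away.
function pickSafestRoute(alternatives: LatLng[][], hazards: LatLng[]): LatLng[] {
  return alternatives.reduce((best, r) =>
    routeSafetyScore(r, hazards) > routeSafetyScore(best, hazards) ? r : best
  );
}
```

The route points would come from decoding the polylines returned by the Routes API; when a new hazard is reported nearby, re-running the scoring over fresh alternatives yields the re-route.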
Challenges we ran into
Our primary challenge was connecting the complex backend: integrating the Google Maps Routes API to find waypoints along walking routes for simple, non-obtrusive navigation was difficult to manage. Managing global positioning with lat/lon coordinates for real-time hazard updating was also hard, as we had to ensure that navigation re-routing happened efficiently and that hazard locations were pulled correctly. We were eventually able to nail down our user flows so that our backend could handle all types of situations as well as possible. This later let us add the blue light system, which simply stores the coordinates of MIT's blue light stations in the database for users to see. We also had a great deal of trouble managing video upload, as the encoding and buffering systems were difficult to set up. Eventually, by using the provided samples, we were able to prevent crashes and efficiently send videos over to the companion app. Building the companion app was also difficult, as wiring up Supabase/Snap Cloud fetching and video stitching was a long process with lots of testing involved.
Accomplishments that we're proud of
- Designing and prototyping a concept that has genuine potential life-saving impact
- Creating a fully functional AR safety workflow on Spectacles within a short timeframe
- Implementing real-time, location-based alerts and routing using Snap Cloud
- Creating an efficient video upload system for Snap Cloud
What we learned
- AR can meaningfully improve safety by keeping users present and aware
- Community-based alerts are powerful but require thoughtful UX design
- Building for real-world risk involves both technical execution and emotional intelligence
- Diverse teams bring critical perspectives—our lived experiences directly shaped the product
What's next for SafeSight
SafeSight is just the beginning of how AR can be used to build safer and more connected communities!
Next, we want to:
- Add .edu email verification for college campuses
- Expand routing intelligence using spatial data and historical hazard data
- Add an AI audio assistant that gives real-time advice and helps users get help for all types of obstacles
- Improve alert validation and filtering to prevent misuse
- Integrate with official campus safety systems
- Explore broader use cases beyond college campuses (like traveling in new locations)
- Conduct user testing to refine interaction speed and clarity
Built With
- google-maps-route-api
- lens-studio
- snap-cloud
- supabase
- swift
- typescript