Inspiration
Fire emergencies are unpredictable and chaotic. Smoke, blocked paths, and panic often make it incredibly difficult for people to locate exits efficiently. In these moments, every second counts. We were inspired to leverage Snap Spectacles and AR technology to build a system that guides people toward fire exits in real time, helping reduce confusion and improving safety during evacuations.
What We Built
Our project, SnapSafe, is an AR navigation system that overlays directional cues and audio feedback directly into a user’s field of view through Snap Spectacles. Using computer vision, the system detects obstacles and identifies exit paths, while 3D AR overlays provide clear arrows and markers pointing to the nearest fire exit. To ensure accessibility even in low-visibility conditions, we added an adaptive audio feedback system that guides users who cannot see the exits.
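The core of the guidance logic is choosing the nearest exit and the direction the AR arrow should point. A minimal sketch of that idea in TypeScript, with hypothetical names of our own (the real project does this with Lens Studio transforms on Spectacles):

```typescript
// Hypothetical helpers, not the actual SnapSafe code: pick the nearest
// exit and compute the yaw the guidance arrow should show the user.
type Vec2 = { x: number; y: number };

// Nearest exit by straight-line distance on the floor plane.
function nearestExit(user: Vec2, exits: Vec2[]): Vec2 {
  return exits.reduce((best, e) =>
    Math.hypot(e.x - user.x, e.y - user.y) <
    Math.hypot(best.x - user.x, best.y - user.y) ? e : best);
}

// Arrow yaw in degrees relative to the user's heading
// (0° = straight ahead, positive = turn right), normalized to (-180, 180].
function arrowYawDeg(user: Vec2, headingDeg: number, exit: Vec2): number {
  const bearing = Math.atan2(exit.x - user.x, exit.y - user.y) * 180 / Math.PI;
  let rel = bearing - headingDeg;
  while (rel > 180) rel -= 360;
  while (rel <= -180) rel += 360;
  return rel;
}
```

The same relative angle can drive the adaptive audio cues (e.g. "turn left" when the yaw is strongly negative), so the visual and audio guidance stay consistent.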
What We Learned
Throughout this project, we learned how to integrate Snap’s Lens Studio scripting with computer vision models, as well as how to deploy and manipulate 3D objects on Snap Spectacles for real-world guidance. More importantly, we discovered the significance of user experience in safety applications, especially the need for accessibility and intuitive design when users are under stress. Additionally, collaboration was key to our success. We learned how to divide tasks efficiently, leverage each other’s strengths, and support one another during high-pressure moments.
Challenges We Faced
Like any hackathon project, we faced various challenges. The steep learning curve of Lens Studio made development difficult, and ensuring smooth, real-time performance on the hardware required careful optimization. We also attempted to train our own model for detecting fire exits, but quickly realized that high-quality training data and sufficient time were outside the scope of a weekend hackathon. Still, pivoting and problem-solving under time pressure taught us valuable lessons about adapting to constraints.
Accomplishments That We Are Proud Of
In the end, we are proud of creating a prototype that successfully integrates computer vision with AR overlays to make navigation in high-stress situations easier. We’re also proud of how much we learned during our first hackathon, from technical integration to design thinking. Most of all, we built a project that demonstrates how AR can be more than entertainment; it can be a critical technology for saving lives.
What's Next
- Obstacle detection, with navigation around obstacles on the way to exits
- World-query-hit to distinguish ground from higher-level surfaces (obstacles), warning users with green vs. red on-screen highlighting
- Voice AI feedback and haptic feedback to empower people with vision impairments
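The ground-vs-obstacle item above boils down to comparing a world-query hit's height against the detected floor. A minimal sketch, assuming a simple height tolerance (the threshold and color names are illustrative; the real version would use Spectacles' World Query hit results):

```typescript
// Illustrative only: classify a world-query hit as walkable ground (green)
// or a raised obstacle (red) based on its height above the floor.
type Highlight = "green" | "red";

// Treat hits within `groundTolM` meters of the floor height as walkable.
function classifyHit(hitY: number, floorY: number, groundTolM = 0.15): Highlight {
  return Math.abs(hitY - floorY) <= groundTolM ? "green" : "red";
}
```

A per-frame pass over hit results could then tint the highlight mesh green or red before rendering.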
Built With
- apis
- computer-vision
- depth-caching
- javascript
- lens-studio
- machine-learning
- snap-ar
- typescript
- world-query-hit

