Inspiration
Safe Walk was inspired by a simple but urgent truth: many people—especially women—don’t feel safe walking alone at night. According to the UK Office for National Statistics, around half of women in the UK report feeling unsafe walking alone after dark. We saw an opportunity to use XR and AI to provide meaningful, real-time support in those moments. Existing phone apps often demand too much interaction when users are stressed, so we reimagined personal safety through smart glasses: hands-free, immediate, and quietly present.
What it does
With a simple gesture or safe word, Safe Walk instantly sends your phone’s GPS location and a live video stream to trusted contacts, removing the friction of unlocking a phone in a moment of stress. An AI companion stays with you until a trusted contact becomes available on the video stream, speaking calmly, providing reassurance, and guiding you toward safer environments such as open shops. The experience is designed with a minimal interface, so it feels less like using an app and more like having a supportive presence beside you.
How we built it
Safe Walk is built in Unity and can be activated hands-free using either gesture detection via the XR Interaction Toolkit or a custom wake word built with OpenAI Whisper. Once safety mode is triggered, passthrough camera frames are streamed through Agora (https://www.agora.io/en/) for video calling, while the user’s GPS location is shared from the paired mobile app and updated in real time using Pusher (https://pusher.com/) and OpenRouteService (https://maps.openrouteservice.org/). An SMS alert with access links is sent via Twilio (https://www.twilio.com/en-us), and the contact can view the session through a Vercel-hosted web app (https://vercel.com); the companion mobile app is built in Expo (https://expo.dev/). Guidance to a nearby safe space is then generated by combining Google Places data (https://developers.google.com/maps/documentation/places), OpenAI LLM reasoning, and OpenRouteService routing for step-by-step navigation toward the safest open location.
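The core of the safe-space step is picking the closest currently open place from the Places results before handing it to routing. A minimal Python sketch of that selection, assuming a simplified list of place dicts with `lat`, `lon`, and `open_now` fields (illustrative names, not the exact Google Places response schema):

```python
from math import radians, sin, cos, asin, sqrt


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))


def nearest_open_place(user_lat, user_lon, places):
    """Return the closest place that is currently open, or None if none are."""
    open_places = [p for p in places if p.get("open_now")]
    if not open_places:
        return None
    return min(open_places, key=lambda p: haversine_m(user_lat, user_lon, p["lat"], p["lon"]))
```

In the real pipeline the winning place’s coordinates would be passed to OpenRouteService for turn-by-turn directions, with the LLM layer phrasing the guidance calmly for the user.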
Build Updates
A major update to Safe Walk was evolving our AI companion from a voice-only assistant into an embodied presence. Research shows that walk-sharing increases perceived safety by 40–80% (Bhowmick et al., 2021), and that embodied agents create stronger trust and social presence in stressful moments (Bousardt et al., 2025; Reeves & Nass, 1996). Inspired by these findings, we added a subtle visual companion that appears within the user’s field of view—calm, unobtrusive, and reassuring. This transforms the AI from a disembodied voice into something that genuinely feels “with” the user, strengthening the emotional impact of Safe Walk and meaningfully improving users’ sense of safety during nighttime or vulnerable walks.
Lessons Learnt
The biggest lesson we learned is that clear early alignment and rapid prototyping are essential—especially for remote teams. We realized that if we had agreed on the core features, user flows, and expectations at the very beginning, we could have saved a lot of time later. Prototyping early in ShapesXR or even Figma would have helped us build a shared vision and avoid misunderstandings.
We also learned how important it is to test technical components early, on multiple devices and in different countries. A difference in device or region can turn an app that worked in one environment into something non-functional in another. Catching those issues earlier would have made development smoother and less stressful.
Challenges Faced
Although Safe Walk is designed for future wearables, we prototyped it on the Meta Quest 3, which lacks built-in GPS. To bridge that gap, we built a companion Android app that streams the phone’s location data directly to the headset.
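One practical detail in that bridge is not flooding the channel with raw GPS fixes. A hedged Python sketch of how the companion app might throttle updates before publishing them, where `send` stands in for the real transport (e.g. a Pusher trigger) and the thresholds are illustrative:

```python
import time
from math import cos, radians, sqrt


def approx_dist_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation of distance in metres; fine for small deltas."""
    dx = radians(lon2 - lon1) * cos(radians((lat1 + lat2) / 2)) * 6371000
    dy = radians(lat2 - lat1) * 6371000
    return sqrt(dx * dx + dy * dy)


class LocationPublisher:
    """Publish a fix only if enough time has passed AND the user actually moved."""

    def __init__(self, send, min_interval=2.0, min_delta_m=5.0, clock=time.monotonic):
        self.send = send                  # callback taking a dict payload
        self.min_interval = min_interval  # seconds between publishes
        self.min_delta_m = min_delta_m    # metres of movement required
        self.clock = clock
        self._last = None                 # (timestamp, lat, lon) of last publish

    def update(self, lat, lon):
        """Feed a new GPS fix; returns True if it was published."""
        now = self.clock()
        if self._last is not None:
            t, plat, plon = self._last
            if now - t < self.min_interval or approx_dist_m(plat, plon, lat, lon) < self.min_delta_m:
                return False
        self.send({"lat": lat, "lon": lon, "ts": now})
        self._last = (now, lat, lon)
        return True
```

The same payload shape can then be consumed by both the headset and the trusted-contact web app.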
We also wanted trusted contacts to help without installing special software, so we created a lightweight web app that opens from an SMS link, instantly showing the user’s live video and location.
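The no-install flow hinges on each SMS carrying a unique session link. A small Python sketch of composing that alert, assuming a hypothetical `base_url` and message wording (neither is the real deployment):

```python
import secrets


def build_alert_sms(user_name, base_url="https://safewalk.example/session"):
    """Compose the trusted-contact SMS with a hard-to-guess session link.

    base_url is a placeholder, not the real Vercel deployment URL.
    Returns (token, body) so the token can be registered server-side.
    """
    token = secrets.token_urlsafe(16)  # unguessable session identifier
    link = f"{base_url}/{token}"
    body = (
        f"{user_name} has activated Safe Walk. "
        f"Tap to see their live video and location: {link}"
    )
    return token, body
```

The body would then be handed to Twilio for delivery, and the token lets the web app authorize the contact without any account or app install.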
Finding a meaningful role for AI took exploration: vision-based threat detection wasn’t reliable enough, so we focused on what AI could do well, providing calm guidance and directing users toward safer nearby locations.
Future Plans
Looking ahead, Safe Walk aims to become a daily-life companion that strengthens personal safety. Our next steps include conducting user studies to understand how the app shapes real-world feelings of safety and using those insights to refine the experience. Currently, our trusted-contact SMS and live video call feature supports users in Germany, the UK, and Ireland, and we plan to expand this capability to additional countries as we grow. To encourage mainstream wearable adoption, we’ll bring Safe Walk to Ray-Ban Meta and other XR glasses, offering a compelling everyday reason to embrace smart eyewear. As judges noted in the XRCC 2025 hackathon, Safe Walk is a feature people might buy smart glasses for—and we aim to extend that impact into a broader, platform-wide lifestyle benefit.
Scientific References
Bhowmick, D., Winter, S., Stevenson, M., & Vortisch, P. (2021). Investigating the practical viability of walk-sharing in improving pedestrian safety. Computational Urban Science, 1(1), 21.
Bousardt, R., van der Sluis, F., van der Veer, G., & Meyer, J. (2025). Embodied conversational agents to support users in complex digital environments: A systematic review.
Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places.
Built With
- metadsdk
- nextjs
- pusher
- react-native
- unity
- vercel

