Inspiration
Walking home alone can be scary and comes with real dangers, especially at night. Beacon was designed to be a safety companion for people walking alone in potentially dangerous situations, to bring peace of mind and act as an additional safety measure in case of emergencies.
What it does
Beacon maps and tracks the route of someone walking alone, from their current location to a destination of their choice. The user confirms their destination, then starts the trip, which brings up a real-time interactive map of their route. Throughout the trip, the user is accompanied by our AI companion, who gives them gentle assurance, encouragement, and guidance, and most importantly monitors their situation for signs of danger. If danger is detected, Beacon immediately calls an emergency contact on the user's behalf and calmly informs them of the situation along with the user's location. With Beacon watching out for them until they're safely home, users can walk alone with greater confidence and peace of mind.
How we built it
We built Beacon using Expo React Native for the mobile frontend, powered by the ElevenLabs Conversational AI SDK for the real-time voice companion. The app uses the Google Maps and Directions APIs for route planning and live map rendering, with a custom route tracking hook that monitors deviation and estimates arrival. For emergency calling, we built a Node.js/Express backend that bridges Twilio's outbound calling with ElevenLabs' voice AI over WebSocket, so when an emergency is triggered, an AI agent autonomously calls the emergency contact and communicates the situation and GPS coordinates in natural language.
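The deviation check at the heart of that route-tracking hook can be sketched as a pure function: measure the user's distance to the polyline returned by the Directions API and flag a deviation past some threshold. This is a minimal sketch, not our actual hook; the names (`isOffRoute`, `haversineMeters`) and the 50 m threshold are illustrative.

```typescript
// Illustrative sketch of the deviation logic inside a route-tracking hook.
// Assumes the route is a dense array of polyline points from the
// Google Directions API; names and threshold are examples, not app code.

type LatLng = { lat: number; lng: number };

// Great-circle distance in meters between two coordinates (haversine).
function haversineMeters(a: LatLng, b: LatLng): number {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// The user is "off route" when farther than thresholdMeters from every
// point of the route polyline.
function isOffRoute(
  pos: LatLng,
  route: LatLng[],
  thresholdMeters = 50
): boolean {
  return route.every((p) => haversineMeters(pos, p) > thresholdMeters);
}
```

In the real hook this check would run on each GPS update, with the result feeding both the on-screen map state and the companion's danger monitoring.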
Challenges we ran into
Getting real-time voice with ElevenLabs' React Native SDK working on a physical Android device was the biggest hurdle. The SDK requires native WebRTC modules that aren't available in Expo Go, so we learned to set up EAS cloud builds. After lots of debugging and testing with no apparent cause of failure in our code, we discovered that university Wi-Fi can block WebRTC's UDP ports entirely, and by switching to mobile data for the voice connection we were able to establish a connection to our ElevenLabs agent. Wiring up the audio pipeline between Twilio's media streams and ElevenLabs' WebSocket in real time, including handling interruptions, pings, and audio chunking correctly, also took significant debugging.
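The core of that Twilio-to-ElevenLabs bridge is translating message formats between the two WebSocket protocols. The sketch below shows the Twilio-side half under two assumptions: Twilio media-stream frames carry base64 audio under `media.payload` (per Twilio's documented event shape), and the agent accepts a `user_audio_chunk` message; treat the ElevenLabs-side field name as illustrative rather than a guaranteed API.

```typescript
// Hedged sketch of the bridge's message translation. Assumes Twilio's
// media-stream frame shape; the outgoing `user_audio_chunk` message is
// modeled on ElevenLabs' conversational WebSocket but is an assumption here.

type TwilioFrame = {
  event: string; // "connected" | "start" | "media" | "stop" | ...
  media?: { payload: string }; // base64-encoded mu-law audio
};

// Convert one raw Twilio WebSocket frame into the JSON string forwarded
// to the voice agent; control events (start/stop/mark) yield null so the
// caller can handle them separately (e.g. tearing down on "stop").
function toAgentAudio(raw: string): string | null {
  const frame: TwilioFrame = JSON.parse(raw);
  if (frame.event !== "media" || !frame.media) return null;
  return JSON.stringify({ user_audio_chunk: frame.media.payload });
}
```

In the real bridge this translation runs inside the WebSocket `message` handler, with additional branches for pings, interruption events, and chunking the agent's audio back into Twilio-sized frames.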
Accomplishments that we're proud of
We're proud of shipping a fully functional end-to-end safety app in a single hackathon session. The voice companion actually works in real time on a physical device, the route tracking correctly detects deviation, and the emergency calling flow autonomously reaches a real phone and delivers location information using AI voice. We're particularly proud of building something that addresses a real problem people face every day. This project is something we would actually use, and building technology to protect people and keep them safe is something we're proud to have done.
What we learned
Through working on Beacon, we learned how to integrate WebRTC-based voice AI into a native mobile app, how to relay real-time audio streams between Twilio and ElevenLabs over WebSocket, and how network environments like university Wi-Fi can silently break WebRTC. We also learned to think and code quickly, as we were iterating and debugging under time pressure.
What's next for Beacon
For Beacon's next steps, we want to add persistent emergency contacts so users are able to keep a database of contacts that the agent cycles through. We also hope to implement additional proactive AI intervention, where Beacon detects danger signals like sudden stops, deviation from route, or distress in the user's voice and triggers the emergency call automatically with more precision and intuition.
Built With
- elevenlabs
- react-native
- typescript