Midnight Mile
"You're never truly walking alone."
Inspiration
Walking home at night should feel safe, but for many women it does not. After speaking with friends and reflecting on personal experiences, we noticed a gap: current map apps help with navigation, but not with safety. We imagined a solution where safety is not a feature but the foundation.
Thus, Midnight Mile was born: a map-based app that offers AI-powered safe walking at night, blending real-time data, trusted help, and immersive reassurance.
What it does
Midnight Mile provides a map-centric app experience where users enter a destination and receive the safest walking route based on crime data, lighting, and foot traffic. It highlights nearby safe spots such as police stations, hospitals, and 24/7 stores. Users can activate an AI voice companion that simulates a trusted friend walking alongside them, periodically checking in, responding to distress cues, and listening for requests to reroute to a safe spot. Upon arrival, users confirm they are safe; otherwise, alerts are sent automatically to trusted contacts.
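The "safest route" choice described above can be sketched as a weighted risk score per route segment. This is an illustrative sketch only: the interface, weights, and factor names are assumptions, not our production scoring.

```typescript
// Hypothetical per-segment safety factors; names and scales are assumptions.
interface RouteSegment {
  crimeIncidents: number; // reported incidents near the segment
  lightingScore: number;  // 0 (dark) to 1 (well lit)
  footTraffic: number;    // 0 (empty) to 1 (busy)
}

// Lower score = safer: crime counts against a segment,
// lighting and foot traffic count in its favor.
function segmentRisk(s: RouteSegment): number {
  return s.crimeIncidents * 1.0 - s.lightingScore * 0.5 - s.footTraffic * 0.5;
}

// Return the index of the candidate route with the lowest total risk.
function safestRoute(routes: RouteSegment[][]): number {
  const risks = routes.map(r => r.reduce((sum, s) => sum + segmentRisk(s), 0));
  return risks.indexOf(Math.min(...risks));
}
```

In practice the candidate routes would come from the Directions API and the factors from external datasets, but the ranking step reduces to a comparison like this.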
How we built it
Core Stack
- Google Maps APIs:
- Directions API for road-to-road route rendering
- Places API to fetch safe zones (police stations, hospitals, 24/7 stores)
- Maps JavaScript API for embedding a custom map in our website
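Fetching safe zones with the Places API Nearby Search endpoint looks roughly like the sketch below. The place-type list, search radius, and key handling are illustrative assumptions.

```typescript
// Place types we treat as safe zones (assumed set, not exhaustive).
const SAFE_PLACE_TYPES = ["police", "hospital", "convenience_store"];

// Build a Nearby Search request URL around the walker's position.
function buildSafeZoneUrl(lat: number, lng: number, type: string, apiKey: string): string {
  const params = new URLSearchParams({
    location: `${lat},${lng}`,
    radius: "1000",    // metres around the walker (assumed radius)
    type,              // one Places type per request
    opennow: "true",   // only places open right now
    key: apiKey,
  });
  return `https://maps.googleapis.com/maps/api/place/nearbysearch/json?${params}`;
}

// Query every safe-zone category and merge the results.
async function fetchSafeZones(lat: number, lng: number, apiKey: string) {
  const responses = await Promise.all(
    SAFE_PLACE_TYPES.map(t =>
      fetch(buildSafeZoneUrl(lat, lng, t, apiKey)).then(r => r.json())
    )
  );
  return responses.flatMap(r => r.results ?? []);
}
```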
- ElevenLabs API:
- Used for real-time voice streaming
- Custom voices that simulate a calm male companion
- Background noise analysis for distress keyword detection (planned)
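Streaming a companion line through the ElevenLabs text-to-speech streaming endpoint looks roughly like this. The voice ID is a placeholder for our custom companion voice, and the model and voice settings shown are assumptions.

```typescript
// Build the request for the ElevenLabs streaming TTS endpoint.
function buildTtsRequest(text: string, voiceId: string, apiKey: string) {
  return {
    url: `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}/stream`,
    init: {
      method: "POST",
      headers: { "xi-api-key": apiKey, "Content-Type": "application/json" },
      body: JSON.stringify({
        text,
        model_id: "eleven_turbo_v2", // a low-latency model; exact choice is an assumption
        voice_settings: { stability: 0.5, similarity_boost: 0.75 },
      }),
    },
  };
}

// Send the line to ElevenLabs; audio bytes arrive as they are generated,
// which keeps perceived latency low while the companion "speaks".
async function speakCompanionLine(text: string, voiceId: string, apiKey: string) {
  const { url, init } = buildTtsRequest(text, voiceId, apiKey);
  const res = await fetch(url, init);
  return res.ok ? res.body : null;
}
```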
- Front End:
- Next.js
- Database:
- Supabase (PostgreSQL)
- LLM:
- Gemini
- Voice and Safety Logic:
- Voice asks “Are you okay?” periodically
- Sends alerts to trusted contacts if no response or panic keywords detected
- “Safe Now” auto-check-in on arrival using geofencing
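The "Safe Now" auto-check-in reduces to a geofence test: compare the walker's position with the destination using haversine distance and confirm arrival inside a small radius. The 50 m threshold here is an assumption.

```typescript
const ARRIVAL_RADIUS_M = 50; // assumed geofence radius around the destination

// Great-circle distance in metres between two lat/lng points (haversine formula).
function haversineMeters(lat1: number, lng1: number, lat2: number, lng2: number): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const R = 6371000; // Earth radius in metres
  const dLat = toRad(lat2 - lat1);
  const dLng = toRad(lng2 - lng1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// True when the walker is inside the arrival geofence, triggering "Safe Now".
function hasArrived(userLat: number, userLng: number, destLat: number, destLng: number): boolean {
  return haversineMeters(userLat, userLng, destLat, destLng) <= ARRIVAL_RADIUS_M;
}
```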
Challenges we ran into
- Streaming voice in real time without latency on no-code platforms
- Making the AI companion feel natural and human, not robotic
- Obtaining reliable, up-to-date safety data for cities like Delhi
- Balancing immersive AI features with a simple, clean UX
Accomplishments that we're proud of
- Successfully integrated multiple Google Maps APIs for safety-focused navigation
- Developed a lifelike AI voice companion using ElevenLabs that enhances user comfort
- Designed an intuitive, map-first UI centered on real-time safety information
- Created a scalable user flow combining destination input, route guidance, and safety check-ins
What we learned
- The importance of combining real-time data with empathetic AI to improve safety perception
- Challenges of working with voice streaming APIs on no-code platforms and ways to overcome latency
- How to design for emotional impact using minimalistic but meaningful UI elements
- Practical insights into integrating AI and geospatial data for social impact
What's next for Midnight Mile
- Implement panic trigger detection using real-time audio analysis
- Expand safety data coverage to more cities worldwide
- Personalize the AI companion’s language, voice, and emotional tone for diverse users
- Enhance immersive features such as Street View integration for richer navigation experience
- Add verified safe spots beyond public locations, such as community members who register to provide help
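The planned panic-trigger detection could start from something as simple as matching a speech transcript against panic keywords before moving to real-time audio analysis. The keyword list and normalisation below are illustrative assumptions.

```typescript
// Assumed panic-phrase list; a real deployment would tune this per locale.
const PANIC_KEYWORDS = ["help", "stop", "leave me alone", "call the police"];

// True if the transcript contains any panic phrase (case-insensitive).
function detectPanic(transcript: string): boolean {
  const normalized = transcript.toLowerCase();
  return PANIC_KEYWORDS.some(k => normalized.includes(k));
}
```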
Built With
- 11labs-api
- gemini
- google-maps-platform-api
- nextjs
- supabase
- typescript
