Inspiration
Our mothers always told us to use the buddy system when walking alone late at night, especially as a young child or woman. Whether walking back from the library to the dorm or wanting to take a quick, safe night stroll, we were always aware, because we had to be. No one ever gave us a tool that actually measured what makes us calm on one street and tense on another, why a route that looks fine on a map never quite feels fine at midnight, or what our body already knows that our brain hasn't caught up to yet. There are tools that tell you the fastest way to get somewhere, but not yet one that listens to your nervous system and learns which routes your body trusts. Despite declining crime rates in the United States, fear of crime has not fallen with them and, if anything, has recently grown. Many women move through the world with keys gripped between their knuckles, wary glances down empty streets, and “text me when you’re home” messages.
What it does
Fear, but no proof it was calibrated to anything real. Confusion over which streets or corners spiked your stress eleven times in a row. Cadence turns these bodily responses and sensory states into actionable data. It reads you: eye movement, walking pace, fluctuations in calm, the things your body does automatically when it feels threatened, even before you consciously register it. It learns you: every walk builds a personal map history, and over time it knows which streets consistently calm you down and which ones your body has never trusted, even if they look fine on a map. Before you leave, it shows you route options that are not just the fastest but are overlaid with lighting conditions, foot traffic, weather, and your own historical stress responses on each path, alongside weekly and monthly insights. It stays with you during the walk while your phone stays in your pocket. And with the ability to connect to AI-powered glasses, navigation could be displayed in your field of vision with subtle movement and sound alerts.
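To make the idea concrete, here is a rough sketch of how per-segment route scoring could blend environmental data with a user's own stress history. Cadence currently lives in Figma, so this is purely illustrative; all names, types, and weights are hypothetical assumptions, not our implementation.

```typescript
// Hypothetical sketch: blend environmental data with a user's own stress
// history to score candidate routes. Weights and types are illustrative.

interface Segment {
  id: string;
  lighting: number;        // 0 (dark) .. 1 (well lit)
  footTraffic: number;     // 0 (empty) .. 1 (busy)
  stressHistory: number[]; // past normalized stress readings on this segment
}

interface Route {
  segments: Segment[];
  minutes: number;
}

// Average of past stress readings; defaults to neutral when there is no history.
function personalStress(seg: Segment): number {
  if (seg.stressHistory.length === 0) return 0.5;
  return seg.stressHistory.reduce((a, b) => a + b, 0) / seg.stressHistory.length;
}

// Lower score = a route the body is more likely to trust.
function routeScore(route: Route): number {
  const segmentCost = route.segments
    .map(seg =>
      0.5 * personalStress(seg) +    // your own history matters most
      0.3 * (1 - seg.lighting) +     // darker segments cost more
      0.2 * (1 - seg.footTraffic))   // empty streets cost more
    .reduce((a, b) => a + b, 0) / route.segments.length;
  // Small penalty for length so reasonably direct routes still win ties.
  return segmentCost + 0.01 * route.minutes;
}

// Rank candidate routes, calmest first.
function rankRoutes(routes: Route[]): Route[] {
  return [...routes].sort((a, b) => routeScore(a) - routeScore(b));
}
```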
How we built it
After lots of brainstorming, user research, and defining key features, we built the app entirely in Figma, using its tools and interactive prototypes to simulate the real-time experience of the app. We mapped biometric inputs to interface states and, throughout the process, made sure the insights did not overwhelm the user.
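For a sense of what "mapping biometric inputs to interface states" meant in the prototype, here is a hypothetical sketch. The thresholds, signal names, and state names are assumptions we mocked up as Figma screens, not shipped code.

```typescript
// Hypothetical mapping from a combined stress reading to the interface
// states we prototyped in Figma. All thresholds are illustrative.

type UiState = "calm" | "attentive" | "alert";

interface BiometricSample {
  heartRate: number;        // beats per minute
  gaitVariability: number;  // 0 .. 1, higher means a more erratic pace
  saccadeRate: number;      // rapid eye movements per second
}

// Collapse the raw signals into one 0..1 stress estimate.
function stressLevel(s: BiometricSample, restingHeartRate: number): number {
  const hr = Math.min(Math.max((s.heartRate - restingHeartRate) / 60, 0), 1);
  const eyes = Math.min(s.saccadeRate / 4, 1);
  return 0.4 * hr + 0.3 * s.gaitVariability + 0.3 * eyes;
}

// Pick which prototype screen to show for a given stress level.
function uiState(level: number): UiState {
  if (level < 0.35) return "calm";     // minimal, quiet interface
  if (level < 0.7) return "attentive"; // gentle check-in, subtle nudge
  return "alert";                      // offer a reroute or share location
}
```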
Challenges we ran into
In the initial brainstorming phase, we took too long to commit to "human sensory experience" as our direction. We also struggled with the question of false signals: your body spikes for lots of reasons, whether a loud noise, a cold breeze, or a sudden memory. Finally, we spent time thinking through how to incorporate the futuristic aspect of AI-powered glasses.
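One direction we discussed for the false-signal problem, sketched here purely as an assumption rather than anything we built: only treat a spike as meaningful if it clearly exceeds the user's rolling baseline and recurs at the same place across several walks.

```typescript
// Hypothetical false-signal filter: a single spike (loud noise, cold breeze)
// is ignored; a street segment is only flagged if spikes recur there across
// multiple distinct walks. All thresholds are assumptions.

interface StressEvent {
  segmentId: string;
  level: number;   // 0..1 stress estimate at the moment of the spike
  walkId: string;
}

function isSpike(level: number, baseline: number): boolean {
  // Require a clear jump above the user's rolling baseline.
  return level > baseline + 0.25;
}

// Flag segments where spikes occurred on at least `minWalks` distinct walks.
function flaggedSegments(events: StressEvent[], baseline: number, minWalks = 3): string[] {
  const walksPerSegment = new Map<string, Set<string>>();
  for (const e of events) {
    if (!isSpike(e.level, baseline)) continue;
    const walks = walksPerSegment.get(e.segmentId) ?? new Set<string>();
    walks.add(e.walkId);
    walksPerSegment.set(e.segmentId, walks);
  }
  return [...walksPerSegment.entries()]
    .filter(([, walks]) => walks.size >= minWalks)
    .map(([segmentId]) => segmentId);
}
```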
Accomplishments that we're proud of
Both of us are intermediate Figma users, so pursuing this project under time pressure was new to us. Collaborating as a team and brainstorming ideas was the most difficult part, but it helped us build stronger user empathy. Working with almost no creative constraints gave us real collaborative freedom to weave our ideas into one another's.
What we learned
A typical navigation app doesn't know that you've arrived home with an elevated stress signature every time you've used the shortcut after 9pm. It doesn't know that the intersection near your apartment spikes your saccadic eye movement eleven out of thirteen times. It doesn't know that you slept badly last night and your threat threshold is lower than usual today. It doesn't know that the route it's suggesting is technically safe but has never once felt that way. We learned that there is a real gap between what technology offers and what people actually need on a walk home alone at night. Cadence treats every user as an individual with a body that has been quietly logging data for years, rather than treating every user the same. We came in thinking we were building a safety app; we left having built an app that listens and promotes comfort and safety on both new and familiar routes!
What's next for Cadence
We would like to expand beyond individual data to anonymous collective patterns across thousands of users, showing cities exactly where their lighting is failing, where foot traffic drops dangerously at night, and where women consistently feel unsafe even on streets that look fine. Cadence could become a tool not just for individuals but for urban planners and law enforcement to improve well-being on foot and add more accessible routes. Additionally, checking an unfamiliar route on the phone can be distracting, so we would like to add a wearable component. Right now the app lives on your phone, but the vision was always for it to disappear into the background entirely, integrated into glasses that overlay the safest route directly into your field of vision.
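If we pursue that collective layer, one privacy-conscious shape it could take is aggregating anonymous per-segment reports and only surfacing a segment to city partners once enough distinct users have contributed. This is purely a sketch with hypothetical names and thresholds, not a committed design.

```typescript
// Hypothetical sketch of the collective layer: aggregate anonymous
// per-segment reports and only expose segments backed by enough distinct
// users (a simple k-anonymity-style threshold). Names are illustrative.

interface AnonymousReport {
  segmentId: string;
  userHash: string;  // one-way hash, never the raw identity
  stress: number;    // 0..1
}

interface SegmentSummary {
  segmentId: string;
  users: number;
  meanStress: number;
}

function summarize(reports: AnonymousReport[], minUsers = 20): SegmentSummary[] {
  const bySegment = new Map<string, AnonymousReport[]>();
  for (const r of reports) {
    const list = bySegment.get(r.segmentId) ?? [];
    list.push(r);
    bySegment.set(r.segmentId, list);
  }
  const summaries: SegmentSummary[] = [];
  for (const [segmentId, list] of bySegment) {
    const users = new Set(list.map(r => r.userHash)).size;
    if (users < minUsers) continue; // too few users: do not report this segment
    const meanStress = list.reduce((a, r) => a + r.stress, 0) / list.length;
    summaries.push({ segmentId, users, meanStress });
  }
  return summaries;
}
```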
Built With
- figma
- prototyping