still

Inspiration

Most wellness apps ask you how you feel. still started from a different question: what if your body already knows, and you just can't hear it yet? When a stressor appears, the amygdala and hypothalamus fire a physiological cascade before the brain's cognitive centres have finished processing the stimulus. For people navigating panic, emotional volatility, autistic burnout, or chronic stress, the nervous system is milliseconds ahead of conscious awareness: the body is already deep in the response before the mind catches up, and the surge that follows can linger well beyond that first flash if it goes unnoticed.

There was also a second, quieter problem: under sustained pressure, the nervous system's baseline drifts upward slowly over weeks, and the person inside it can't feel the shift because they're adapting to it in real time. Their "fine" today is last month's crisis, and they have no way to see it. We wanted to make both of those invisible things visible, without ever asking a user to name what they feel.

How we built it

We started with research into the psychophysiology of stress, mapped out user flows, and then moved from lo-fi wireframes in Figma to the full hi-fi prototype in Figma Make. We imported our design system first, so the tool understood our colour tokens, typography (Libre Caslon Condensed and PP Mori), and component logic from the start.

The breathing ring animation lives on the iPhone, guiding each inhale and exhale on a 4:8 ratio with no hold phase; the extended exhale intentionally shifts the body toward parasympathetic dominance, while the Watch shows the water level settling in real time as the body calms. When AirPods are connected, noise cancellation activates to close the user off from external stimuli and layer ambient wave sounds over the exercise, grounding the experience in the blue-space metaphor. For sensor logic, every decision was grounded in real Apple Watch capabilities:

  • PPG optical heart sensor: reads heart rate and HRV, the core metric for both Wave and Waterline
  • Accelerometer: detects exercise vs. stillness, and flags fidgeting or tremor patterns
  • Wrist temperature sensor (Series 8+): records overnight skin-temperature shifts that feed long-term Waterline tracking
  • Taptic Engine: delivers the gentle wrist tap and breath-pacing haptics
  • AirPods Pro 3 (optional): provides a second HRV source during the breathing exercise, covering the gap when haptics interrupt the optical sensor

We also documented three speculative hardware improvements the Watch would need to make the detection fully reliable, which live outside the prototype scope but grounded our design decisions in what's technically honest versus what we had to work around.
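The 4:8 pacing described above can be sketched as a simple schedule generator. The prototype itself is a Figma Make design, not code, so this Python sketch is purely illustrative; the phase names, the 15-cycle session length, and the function name are our assumptions.

```python
# Illustrative sketch of still's 4:8 breathing pacer (no hold phase).
# The prototype is a Figma design; this only shows the timing logic.

INHALE_S = 4.0   # seconds of inhale
EXHALE_S = 8.0   # extended exhale biases toward parasympathetic dominance

def breathing_schedule(cycles: int) -> list[tuple[str, float, float]]:
    """Return (phase, start_time, duration) tuples for `cycles` breaths."""
    schedule = []
    t = 0.0
    for _ in range(cycles):
        schedule.append(("inhale", t, INHALE_S))
        t += INHALE_S
        schedule.append(("exhale", t, EXHALE_S))
        t += EXHALE_S
    return schedule

# A three-minute session fits exactly 15 cycles of 12 seconds each.
session = breathing_schedule(15)
```

In a real build, each tuple would drive the ring animation on the iPhone and the breath-pacing haptics on the Watch.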

Challenges we ran into

The hardest part was narrowing down the concept itself: we had three strong ideas, and the decision came down to which one had the clearest human need, the most honest technical grounding, and the strongest story to tell. Once we committed to still, the research became its own challenge. The psychophysiology of stress, HRV metrics, and allostatic load aren't light reading, and making sure we understood the science well enough to design responsibly took real time. Storytelling was the third hurdle: translating dense physiological concepts into language that feels calm, human, and free of clinical weight required more iteration than any other part of the prototype.

Accomplishments that we're proud of

We're proud that still never asks a user to name what they feel, which makes it equally useful for someone with strong interoception and someone with almost none. The Waterline concept is something no existing tool does: it compares a user's baseline only against their own history, surfacing the kind of slow drift that's impossible to feel from the inside and giving people a way to see the shape of their own water over time. We're also proud of the graduation mechanic, where still tracks whether users are catching waves before the Watch does and, if they are, offers to step back:

Most apps are designed around retention, and still is designed around making itself unnecessary.
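One way the graduation mechanic could work is sketched below in Python. This is a hypothetical illustration, not part of the prototype: the session labels, the 20-session window, and the 75% threshold are all our assumptions.

```python
# Hypothetical sketch of still's graduation check: if the user starts
# catching waves before the Watch prompts them, offer to step back.

def offer_step_back(sessions: list[str], window: int = 20,
                    threshold: float = 0.75) -> bool:
    """sessions: most-recent-last labels, "user" if the user initiated
    the exercise themselves, "watch" if the Watch prompted it.
    Returns True when the user initiated at least `threshold` of the
    last `window` sessions."""
    recent = sessions[-window:]
    if len(recent) < window:
        return False  # not enough history to judge yet
    user_led = sum(1 for s in recent if s == "user")
    return user_led / len(recent) >= threshold
```

The design intent is in the default behaviour: with too little history, the function declines to act rather than nudging the user prematurely.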

What we learned

We're designers, not psychology students, so the research was genuinely humbling. One of our team members was taking an introductory psychology course as an elective, and her class notes on the nervous system, the autonomic cascade, and sensory perception ended up being one of our most useful references; it was largely that material that helped us land on interoception as the sense we were designing for. We also learned how much the water metaphor carries: it gave us a shared language for states that are otherwise invisible, let us talk about stress, drift, and calm without using any of those words, and became the actual interface rather than just a visual choice. Finally, we learned how deliberate you have to be about what you don't show. Displaying HRV numbers, stress scores, or anything metric-facing would have shifted the app from a calming tool into something that causes the exact panic it's trying to address, so every decision about what to surface and what to withhold was made with that tension in mind.

What's next for still

The most immediate next step is a calibration onboarding flow. We intentionally left it out of the demo, but it would be essential in a real build: the app compares each user only against their own history, and it needs at least two weeks of data before the Waterline means anything. We also want to give users the ability to delete all their data at any time, with no confirmation loops and no guilt; the water belongs to them, and if they want to start over, the app lets go. Beyond that, we want to explore what still looks like for people who share a space, where two consenting users could see when their waters are rising at the same time, not to compare, but to recognize that something is moving through the room and it isn't only them. The goal has always been the same: teach the language well enough that the user stops needing the translation. still works best when Emmy starts catching her own waves before the Watch does.
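The calibration requirement could look something like this sketch. The 14-day minimum comes from the two weeks mentioned above; the drift measure itself (a recent average compared against the user's own baseline, with falling HRV read as a rising waterline) is our assumption for illustration.

```python
from statistics import mean

MIN_DAYS = 14  # the Waterline needs at least two weeks of a user's own data

def waterline(daily_hrv: list[float], recent: int = 3):
    """Compare the last `recent` days against the user's own baseline.
    Returns None until calibration is complete; otherwise the drift as
    a fraction of baseline (negative HRV drift = a rising waterline)."""
    if len(daily_hrv) < MIN_DAYS:
        return None  # still calibrating; show nothing rather than noise
    baseline = mean(daily_hrv[:-recent])
    current = mean(daily_hrv[-recent:])
    return (current - baseline) / baseline
```

Returning None during calibration mirrors the product decision: until the app knows the user's own water, it says nothing at all.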

Built With

  • figma
  • figmamake