We started with a feeling most people recognize but can't explain: walking away from certain people exhausted, and from others somehow restored, without knowing why. Science has a name for it: co-regulation, the biological process by which nervous systems synchronize through proximity, shared rhythm, and physiological mirroring. It's real, it's measurable, and until now no tool has surfaced it to the people experiencing it. We went deep into the AI landscape, existing wearables, and biosensor research to understand what was possible, and found a gap nobody had filled. Within was built to fill it.

What it does

Within is a companion app paired with Verocol, a passive biosensor collar that sits invisibly inside your neckline. It continuously reads your heart rate variability, breathing rhythm, electrodermal activity, and skin temperature — then cross-references those signals with anyone nearby wearing a paired unit, human or animal.
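The cross-referencing step described above could work along these lines: align two wearers' heart rate variability samples over a shared window and score how closely the rhythms move together. This is a minimal, hypothetical sketch, not Within's actual algorithm; the name `synchrony_score` and the choice of Pearson correlation are illustrative assumptions.

```python
from statistics import mean

def synchrony_score(hrv_a, hrv_b):
    """Pearson correlation of two equal-length HRV samples, in [-1, 1].

    Illustrative only: a real pipeline would resample, detrend, and
    lag-adjust the signals before comparing them.
    """
    if len(hrv_a) != len(hrv_b) or len(hrv_a) < 2:
        raise ValueError("need two equal-length series of at least 2 samples")
    ma, mb = mean(hrv_a), mean(hrv_b)
    cov = sum((a - ma) * (b - mb) for a, b in zip(hrv_a, hrv_b))
    var_a = sum((a - ma) ** 2 for a in hrv_a)
    var_b = sum((b - mb) ** 2 for b in hrv_b)
    if var_a == 0 or var_b == 0:
        return 0.0  # a flat signal carries no rhythm to synchronize with
    return cov / (var_a * var_b) ** 0.5

# Two people whose rhythms rise and fall together score near 1.0.
print(round(synchrony_score([52, 55, 61, 58, 50], [48, 51, 59, 55, 47]), 2))  # prints 0.99
```

A score near 1.0 would suggest two nervous systems moving in step; a score near zero or below, the opposite.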

The app surfaces one plain-language insight at the end of each day. Not a dashboard. Not a data dump. One observation about your nervous system and the people you spent time with. Over time, it builds a Relationship Map: every person and pet in your network, color-coded by how your body actually responds to them. It also generates a Mind File, a plain-English document capturing your emotional baseline and relational patterns, exportable to any AI assistant so conversations finally start from where you actually are.
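The Mind File export could be as simple as rendering structured observations into plain text an assistant can take as context. A hypothetical sketch, assuming a `MindFile` structure and field names that are illustrative, not Within's actual format:

```python
from dataclasses import dataclass, field

@dataclass
class MindFile:
    """Plain-English snapshot of a user's baseline and relational patterns."""
    baseline: str
    patterns: list = field(default_factory=list)

    def to_text(self) -> str:
        # Render as plain English so any AI assistant can ingest it verbatim.
        lines = [f"Emotional baseline: {self.baseline}", "Relational patterns:"]
        lines += [f"- {p}" for p in self.patterns]
        return "\n".join(lines)

mf = MindFile(
    baseline="steady mornings, drained after long meetings",
    patterns=["restored around close friends", "tense in large groups"],
)
print(mf.to_text())
```

Keeping the export plain text rather than a proprietary schema is what would let it travel into any AI conversation, which matches the design intent described above.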

How we built it

We began with a software requirements specification (SRS), defining the full behavioral specification for both the hardware unit and the companion app before drawing a line of UI. From there we moved into use case development, grounding every design decision in real user stories: the therapist burning out after eight sessions, the dog trainer whose stress travels down the leash, the person who finally wants their AI assistant to know who they are. The app UI was designed around a calm-tech philosophy: ambient visualizations, no clinical charts, no interruptions during the day. The Encounters screen, Timeline, SmartPrompt, and Mind File system each went through multiple rounds of intentional reduction.

Challenges we ran into

The hardest tension in the entire design was the line between insight and anxiety. Giving people data about their nervous system responses to people they love is powerful — and potentially destabilizing. We spent significant time on the language layer: making sure nothing in the app pathologizes a relationship, frames drain as someone's fault, or creates a feedback loop of biological self-surveillance. Every label, every color, every prompt was written to feel like self-knowledge — not self-judgment.

Designing for the dog use case added another layer. Human-animal co-regulation is scientifically documented but emotionally delicate territory. Framing that data as guidance for human behavior change — not behavioral judgment of the animal — required care at every touchpoint.

Accomplishments that we're proud of

The Mind File. The idea that your emotional and relational context could travel with you into any AI conversation — that a tool could finally give an AI assistant enough of you to respond with real relevance — feels genuinely new. Seeing it land in the Encounters screen alongside the SmartPrompt suggestions made the whole system click.

We're also proud of the restraint. Within deliberately does less than it could. One insight. One end-of-day prompt. No social sharing. No leaderboards. No raw data by default. In a space crowded with apps that overwhelm, designing for calm was the hardest and most important call we made.

What we learned

That the most meaningful data is often the data people already feel but can't name. Within doesn't tell users anything their body doesn't already know — it just gives that knowledge a language. The design challenge wasn't building something new. It was building something honest enough to be trusted with something that personal.

What's next for Within

Group co-regulation — extending dyadic tracking to small group contexts like family dinners or team meetings. Longitudinal coaching integration — giving therapists consented access to their clients' co-regulation trends to inform session work. And a cross-species coherence research layer — an anonymized aggregate dataset for academic study of human-animal physiological synchrony. The collar is just the beginning. The sense has always been there.

Built With

  • figma
  • slides