Anchor

Inspiration

Someone close to our team experiences derealization — that deeply unsettling feeling that the world isn't real, that you've disconnected from your own body and surroundings. Watching them navigate episodes revealed something that existing tools completely miss: derealization is fundamentally a breakdown of interoception, the body's internal sense of itself.

Interoception is one of the least talked about human senses, yet it may be the most foundational. It is the nervous system's continuous, largely unconscious reading of signals from within: heartbeat, breath rhythm, muscle tension, gut state, skin temperature. It is how you know you are hungry before you think about food, how your chest tightens before you consciously register fear. More than anything, it is the sense of being present inside your own body, grounded in physical reality. Most people never notice it because it simply works. But for those who experience derealization, this sense becomes unreliable or disappears entirely, and the world starts to feel like something happening at a distance, like watching your own life through glass.

And yet every existing grounding tool ignores this entirely. They treat derealization as a cognitive problem, offering instructions to follow and prompts to think through at the exact moment the thinking mind is the least reliable thing available. We wanted to build something that worked at the level where the problem actually lives: in the body's real-time, moment-to-moment sense of itself.


What It Does

Anchor is an AR glasses and mobile app system that passively monitors the interoceptive signal stream, reading heart rate variability, breathing cadence, and eye movement patterns to detect when a dissociative shift is occurring. These are not just health metrics. They are the measurable surface of an interoceptive system under stress.
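Anchor exists today as a design prototype, so the detection logic below is purely illustrative. One simple way a passive detector could work is to keep a rolling per-user baseline for each signal (heart rate variability, breathing cadence, gaze fixation stability) and flag a possible dissociative shift only when several signals drift from baseline at once. The class name, signal names, and thresholds here are all hypothetical:

```python
from collections import deque
from statistics import mean, stdev

class ShiftDetector:
    """Illustrative sketch: flag a possible dissociative shift when several
    interoceptive proxies deviate from the wearer's rolling baseline at once."""

    def __init__(self, window=30, z_threshold=2.0, min_signals=2):
        self.window = window            # samples of baseline kept per signal
        self.z_threshold = z_threshold  # z-score that counts as a deviation
        self.min_signals = min_signals  # signals that must deviate together
        self.history = {"hrv": deque(maxlen=window),
                        "breath_rate": deque(maxlen=window),
                        "fixation_stability": deque(maxlen=window)}

    def _deviates(self, name, value):
        h = self.history[name]
        if len(h) < 10:                 # not enough baseline collected yet
            return False
        mu, sd = mean(h), stdev(h)
        if sd == 0:
            return False
        return abs(value - mu) / sd > self.z_threshold

    def update(self, hrv, breath_rate, fixation_stability):
        """Feed one sample per signal; return True if a shift is suspected."""
        sample = {"hrv": hrv, "breath_rate": breath_rate,
                  "fixation_stability": fixation_stability}
        deviations = sum(self._deviates(k, v) for k, v in sample.items())
        for k, v in sample.items():     # fold the sample into the baseline
            self.history[k].append(v)
        return deviations >= self.min_signals
```

Requiring multiple signals to deviate together is one way to keep false positives low, so the underwater world only appears when the body's whole signal picture shifts, not when a single sensor reads noisily.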

When a shift is detected, Anchor never alarms the user or signals that something is wrong. The world simply becomes more alive. Bioluminescent fish appear on nearby surfaces. A soft ambient tone begins. Without any conscious effort required, the user is gently guided through three grounding exercises designed to bring them back to their senses: visual seeking to re-engage spatial awareness, physical touch to reactivate tactile body sense, and breathwork to restore the breath-heartbeat feedback loop. All of it is disguised as an immersive underwater world.
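The intervention described above is a fixed sequence, so it can be sketched as a tiny state machine: detection moves the experience out of idle, the three grounding exercises run in order, and the session ends in the reward state. This is an assumption about flow, not implemented code; the phase names are invented for illustration:

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()             # passive monitoring, no overlay
    VISUAL_SEEKING = auto()   # fish appear; re-engage spatial awareness
    TOUCH = auto()            # tactile prompts reactivate body sense
    BREATHWORK = auto()       # paced breathing restores the breath-heartbeat loop
    REWARD = auto()           # a creature joins the user's ocean habitat

# Order of the grounding flow once a shift is detected
SEQUENCE = [Phase.VISUAL_SEEKING, Phase.TOUCH, Phase.BREATHWORK, Phase.REWARD]

def next_phase(current):
    """Advance through the grounding flow; remain in REWARD until reset."""
    if current is Phase.IDLE:
        return SEQUENCE[0]
    idx = SEQUENCE.index(current)
    return SEQUENCE[min(idx + 1, len(SEQUENCE) - 1)]
```

Encoding the flow this way keeps the sequencing in one place, which matters for an experience where the order of exercises (spatial, then tactile, then autonomic) is itself part of the design.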

The design principle is deliberate. Instead of asking users to think their way back to their body, Anchor gives the interoceptive system direct sensory inputs to work with. And because healing deserves to be celebrated, every completed session earns a creature that takes up residence in a growing personal ocean habitat, giving users something beautiful to watch themselves build over time.


How We Built It

Anchor was designed in Figma, with FigmaMake used in the early stages for rapid brainstorming and iteration. From there, all user flows and prototype screens were built by hand, allowing us to carefully control the pacing, transitions, and emotional tone of each interaction. The AR objects central to the grounding experience were modeled by hand in Blender, giving us full creative control over the underwater creatures and environments that make the experience feel alive.


Challenges We Ran Into

Our biggest design challenge was understanding how to intervene during a derealization episode without making things worse. The wrong signal at the wrong moment could increase panic, pull the user further outside of reality, or make them feel surveilled and medicalized. Every decision, from the way a visual element appears to the words we chose or chose not to use, had to be tested against that question. Alongside that, figuring out how to meaningfully represent AR within a Figma prototype was a significant technical constraint we had to work around creatively, since Figma has no native support for cross-device or mixed reality flows.


Accomplishments We're Proud Of

We are most proud of the passive detection concept, the idea that Anchor finds you rather than waiting to be found. Using AR as the intervention medium felt like the right call: it overlays the real world rather than replacing it, keeping users tethered to physical reality while gently redirecting their attention. We are also proud of the three grounding exercises and how deliberately each one targets a different interoceptive channel (visual-spatial, tactile, and autonomic), making the experience not just beautiful but purposeful.


What We Learned

We learned how deeply interoception has been overlooked as a design space. Most wellness tools optimize for behavior change at the cognitive level through habit loops, reminders, and reflection prompts. But for conditions rooted in disrupted body-sense, that is the wrong layer to target entirely. We also learned how much design language matters in mental health tools. Early versions of screens felt inadvertently clinical. The wrong font weight, a slightly too-bright color, or a label that was too direct could completely shift the emotional register and undermine the sense of safety Anchor needs to create.


What's Next for Anchor

The immediate next step is validating the passive interoceptive detection model with people who actually experience derealization, ensuring the sensor thresholds are meaningful and that the intervention feels helpful rather than intrusive. From a product perspective, we want to expand the creature collection, build out evening reflection and morning check-in rituals that help users develop interoceptive awareness over time, and explore real hardware partnerships for the sensor suite. Longer term, the interoceptive monitoring model at Anchor's core could extend well beyond derealization, reaching anxiety, panic, chronic dissociation, and other conditions where the gap between body and mind is the central problem to solve.
