Inspiration
Most of us have felt it: a day that's inexplicably heavy, a decision that felt wrong, a version of ourselves we couldn't quite locate. The gap between what the body is experiencing and what the mind is conscious of is something we've both lived with for a long time. That frustration is what led us here.
What it does
Ora is a wearable biosensor patch and a companion app that gives users access to a sense they already have but have never been able to read. The patch sits on the sternum and continuously reads seven physiological signals: heart rhythm, vagal tone, cortisol, breath texture, skin conductance, skin temperature, and blood oxygen. That signal stream surfaces through two simultaneous output channels: a haptic channel that vibrates directly on the skin when the user's state shifts, and a visual channel that renders the data as a single organic shape on the phone screen, updating in real time.
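To make the two-channel idea concrete, here is a minimal sketch of how a signal stream might be routed: the visual channel re-renders on every reading, while the haptic channel fires only when the state shifts past a threshold. The signal names, the shift metric, and the threshold value are our illustrative assumptions, not the actual product's implementation.

```python
# Hypothetical sketch of Ora's two output channels. The visual channel
# updates on every reading; the haptic channel fires only on a state shift.
# All names and the threshold are illustrative assumptions.

STATE_SHIFT_THRESHOLD = 0.15  # assumed value, not from the real product

def state_shift(previous: dict, current: dict) -> float:
    """Mean absolute change across all normalized (0-1) signals."""
    keys = previous.keys()
    return sum(abs(current[k] - previous[k]) for k in keys) / len(keys)

def route_outputs(previous: dict, current: dict, haptics, display) -> None:
    """Send one reading to both channels with their different cadences."""
    # Visual channel: always re-render the on-screen shape.
    display(current)
    # Haptic channel: vibrate only when the shift is meaningful.
    if state_shift(previous, current) > STATE_SHIFT_THRESHOLD:
        haptics(current)
```

The asymmetry is the point: the screen carries the continuous stream, while the skin is only interrupted when something actually changes.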
Who it's for: People in their mid-twenties to late thirties who are functional and self-aware but running just slightly behind themselves: making decisions, managing relationships, carrying load they can't fully articulate.
The new sense: Interoception. Emerging science now identifies it as one of the most fundamental human senses: the foundation of emotional regulation, decision-making, and self-awareness. Ora trains and develops it over a 12-week arc, giving users a channel of perception that runs alongside daily life without interrupting it.
The wellness goal: To close the gap between what the body is doing and what the mind is conscious of across emotional, mental, and social dimensions of wellbeing.
Ora is designed around the principle that more data is not always better. The blob communicates everything through a single shape: no numbers, no charts, no interpretation required. The haptic channel delivers signal without requiring the user to look at anything. The five app views (Feel, Body, Haptic, Sense, and Signal) each present the same continuous stream through a different lens, depending on what the user needs at that moment.
How we built it
Ora was designed entirely in Figma — the interface in Figma Design, the interactive components in Figma Make, and the presentation in Figma Slides. The organic blob visualization is a custom-built generative component driven by five parameters — intensity, fragmentation, warmth, density, and coherence — that map directly to the physiological signals the patch reads. The onboarding flow, the Feel tab decision engine, the Sense vocabulary system, and the Together shared session experience were all prototyped as live interactive flows in Figma Make embedded directly in the submission slides.
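As an illustration of how the five blob parameters could be derived from the seven patch signals, here is a hedged sketch in Python. The signal pairings and weightings are assumptions made for the example; the real generative component was built in Figma Make, not in code.

```python
# Hypothetical mapping from normalized physiological signals (0.0-1.0)
# to the five blob parameters. Pairings and weights are illustrative,
# not the actual Figma Make implementation.

def clamp(x: float) -> float:
    """Keep a parameter inside the renderable 0-1 range."""
    return max(0.0, min(1.0, x))

def blob_params(signals: dict) -> dict:
    """Derive the five blob parameters from the seven patch signals."""
    return {
        # Faster heart rhythm and higher skin conductance read as intensity.
        "intensity": clamp(0.6 * signals["heart_rhythm"]
                           + 0.4 * signals["skin_conductance"]),
        # Irregular breath texture fragments the shape's outline.
        "fragmentation": clamp(signals["breath_texture"]),
        # Skin temperature drives how warm the shape feels.
        "warmth": clamp(signals["skin_temperature"]),
        # Cortisol load makes the shape denser and heavier.
        "density": clamp(signals["cortisol"]),
        # High vagal tone and stable blood oxygen pull the shape together.
        "coherence": clamp(0.7 * signals["vagal_tone"]
                           + 0.3 * signals["blood_oxygen"]),
    }

# Example: a calm state yields a smooth, settled shape.
calm = blob_params({
    "heart_rhythm": 0.3, "vagal_tone": 0.9, "cortisol": 0.2,
    "breath_texture": 0.1, "skin_conductance": 0.2,
    "skin_temperature": 0.5, "blood_oxygen": 0.95,
})
```

Under this sketch a calm reading (high vagal tone, low cortisol) produces high coherence and low intensity, which is the kind of direct signal-to-form legibility the component aims for.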
Challenges we ran into
The hardest design problem was representing a continuous, multidimensional signal in a way that could be read instantly without analysis. Most data visualization defaults to charts, numbers, and time series. We had to build something that communicated state the way a face communicates mood: immediately, through form and movement, without requiring a translation step. Getting the blob parameters to feel legible rather than arbitrary took significant iteration.
The second challenge was the Together tab: designing real-time physiological sharing between two people without creating an asymmetry that could be used manipulatively. Consent architecture for something this sensitive required careful thought about what is visible, when, and to whom.
Accomplishments that we're proud of
We're proud that Ora answers the prompt literally and seriously. Interoception is a real, measurable, trainable sense, one that emerging science identifies as foundational to emotional and cognitive functioning. We're also proud of the coherence of the design philosophy. Every decision (the blob, the haptics, the 12-week arc, the vocabulary system) traces back to the same underlying principle: the gap is in the translation, not in the body.
What we learned
That the hardest design problems are not interface problems; they're framing problems. How you describe what a tool does determines whether people understand what it's for. We rewrote the onboarding four times before it stopped feeling like a feature list and started feeling like an explanation.
We also learned that restraint is a design decision. Every time we considered adding more (more data, more views, more feedback), the product got worse. Ora works because it does less than you expect and delivers it through channels that don't ask for your attention.
What's next for Ora
The 12-week interoceptive literacy arc is the core of what Ora could become. The next step is developing the progression model more fully: defining what vocabulary milestones look like, how the app adapts its feedback as the user's awareness develops, and what it means for the patch to become genuinely less necessary over time.
The Together tab also has significant room to grow: shared physiological context between people in ongoing relationships, therapeutic contexts, or team environments is an area where Ora's approach could have real impact.
And ultimately, the question we want to keep asking is the same one we started with: what would it mean if the body's signals were already legible? We think we've built the first version of an answer.
Built With
- figma