Inspiration

In November 2025 I was diagnosed with bipolar disorder. It hit me really hard. But it also made me want to understand everything about what was happening to me, so I started researching. A few months later I told my teammate Jose. He didn't just listen, he jumped in with me. We started researching bipolar together. Then Figma dropped their prompt around senses. We started pulling threads: interoception, chronoception, proprioception. The more we dug, the more it connected back to bipolar. It clicked for me in a way that felt personal. I want this device. I genuinely want it. Tracking triggers is something I am supposed to do anyway but it gets hard to stay consistent. A wearable that does it passively all day, and lets me share the data with my therapist and psychiatrist, could actually change how fast we figure things out. My meds, my patterns, my life. That is where Resonance came from.

What it does

Resonance is a wearable system that reads your nervous system in real time and gives the data back to you, privately, without anyone else deciding what it means. It has two core features:

  • Aura captures interoceptive signals (HRV, EDA, breathing rate, skin temperature) plus three environmental channels: touch, sound (decibel level only, no audio stored), and crowd density estimated from the count of nearby Bluetooth and WiFi signals.
  • Liminal detects when your body crosses a personal threshold and sends a haptic pulse.

Over time the system builds a pattern dashboard: your triggers, your rhythms, your data. It comes in three form factors, a Pendant, a Clip, and a Band, each designed for a different body and a different relationship with sensation. All three feed into AR Glasses that surface signals at the periphery of your field of view without interrupting it. All data lives on your device; nothing leaves without your explicit consent.
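Since Resonance is a speculative design, no firmware exists yet, but the core of Liminal can be sketched simply: compare each new sample against a rolling baseline of the wearer's own recent signal, and fire the haptic pulse only on a personal deviation, not an absolute cutoff. The class and parameter names below are hypothetical, and the z-score approach is one plausible way to implement "crossing a personal threshold", not the project's specified algorithm.

```python
from collections import deque
from statistics import mean, stdev

class LiminalDetector:
    """Rolling-baseline threshold detector: signals a haptic pulse when a
    sample deviates from the wearer's OWN recent baseline, so the threshold
    is personal rather than a fixed population cutoff."""

    def __init__(self, window: int = 120, z_threshold: float = 2.0):
        self.window = deque(maxlen=window)   # e.g. last 120 samples
        self.z_threshold = z_threshold       # per-user sensitivity setting

    def update(self, sample: float) -> bool:
        """Feed one sensor sample (e.g. EDA in microsiemens).
        Returns True when the sample crosses the personal threshold."""
        crossed = False
        if len(self.window) >= 30:           # wait for a minimal baseline
            mu = mean(self.window)
            sigma = stdev(self.window) or 1e-9   # guard a flat baseline
            crossed = abs(sample - mu) / sigma > self.z_threshold
        self.window.append(sample)
        return crossed
```

In use, the wearable would call `update()` on each new reading and route a `True` result to the haptic driver; because the baseline is a rolling window, the detector also adapts as the wearer's normal drifts across the day.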

How we built it

We built Resonance as a speculative design project in Figma. The process started with research: deep dives into interoception, chronoception, proprioception, and the neuroscience connecting sensory processing to bipolar disorder, autism, and alexithymia. From there we built three case studies grounded in real physiological behavior: Nel (bipolar II), Darius (autistic), and Priya (ER nurse with alexithymia). Each one shaped a different device, a different data layer, and a different use of Aura and Liminal. We designed the full system architecture including sensor specs, security layers, and compliance with HIPAA, GDPR, CCPA, BIPA, GINA, ADA, FDA Digital Health guidelines, and FCC regulations. Then we brought it to life in Figma: the app UI, the AR overlay, the consent flows, the pattern dashboard, and the device form factors.

Challenges we ran into

The hardest challenge was designing for people whose relationship with their own body is already complicated. Every decision had an ethical dimension. If Liminal tells you your body is shifting, what does that do to someone mid-episode? If Aura tracks your triggers, who else might want that data? We had to build a system that was genuinely useful without being paternalistic, one that returns information without interpreting it, that senses without surveilling. The line between helpful and harmful is thin when the data is this personal. Security was also genuinely difficult to think through. This is not step count data. For Nel, it is the physiological fingerprint of her bipolar cycle. For Priya, eight months of burnout her employer does not know about. The architecture had to make misuse structurally impossible, not just policy-prohibited.
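"Structurally impossible, not just policy-prohibited" can be made concrete with a capability-style export gate: the only code path that releases data requires a consent token that only the wearer's explicit consent flow can mint. The sketch below is an illustration of that design principle, not the project's actual architecture; `ConsentGate`, the recipient names, and the HMAC scheme are all assumptions for the example.

```python
import hashlib
import hmac
import json

class ConsentGate:
    """Consent-gated export: records stay on device, and export() cannot
    produce data without a token HMAC-signed by a device-local secret.
    Only the explicit consent UI flow calls grant_consent(), so no other
    code path can forge a valid token."""

    def __init__(self, device_secret: bytes):
        self._secret = device_secret
        self._store = {}                      # scope -> on-device records

    def record(self, scope: str, sample: dict) -> None:
        self._store.setdefault(scope, []).append(sample)

    def grant_consent(self, recipient: str, scope: str) -> str:
        # Invoked only from the wearer's explicit consent flow.
        msg = f"{recipient}|{scope}".encode()
        return hmac.new(self._secret, msg, hashlib.sha256).hexdigest()

    def export(self, recipient: str, scope: str, token: str) -> str:
        msg = f"{recipient}|{scope}".encode()
        expected = hmac.new(self._secret, msg, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, token):
            raise PermissionError("no valid consent for this recipient/scope")
        return json.dumps(self._store.get(scope, []))
```

Because consent is bound to a specific recipient and data scope, a token granted to a clinician cannot be replayed by an employer, which is exactly the misuse case the Priya scenario raises.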

Accomplishments that we're proud of

Honestly, everything. The passion we put into this felt different from any project before. We did not just design a product, we considered every single person who might use it. People with disabilities. People who are neurodivergent. People who are just curious about their own body. Every decision we made had a real person behind it, and we never lost sight of that.

What we learned

We came in knowing the five senses. We left knowing there are 33. That alone reframed everything. Interoception, chronoception, proprioception, nociception, thermoception, the sense of balance, the sense of hunger. The human body is constantly generating information that most of us never get access to. Learning that was the foundation of everything Resonance became. The science did not just inform the project, it changed how we think about what it means to be in a body.

What's next for Resonance

The immediate next step is prototyping the hardware, starting with the Band, which has the most accessible sensor stack. We want to validate whether passive HRV and EDA collection can reliably surface the patterns we designed around. From there: a working version of the Liminal pattern dashboard, real clinical partnerships to test whether sharing Aura data with therapists and psychiatrists actually accelerates treatment decisions, and a proper accessibility review with autistic self-advocates and people with bipolar disorder. Longer term, Resonance is a platform. The three case studies are a starting point. There are dozens of conditions where interoceptive data could change someone's life if they could actually access it. We want to build toward that.

Built With

  • figma