Inspiration

Pain is one of the most universal human experiences, yet it remains one of the hardest things for medicine to measure. Most healthcare systems rely on a simple 1–10 pain scale, a system that has gone largely unchanged for decades.

This problem is personal to us. We’ve seen it firsthand, in our own lives and in the lives of friends and family: pain dismissed, misunderstood, or difficult to diagnose. It made us question why patients still have to constantly prove or explain what they feel.

This led us to ask a speculative question:
What if pain didn’t need to be described or believed? What if it could be seen?

That idea became Sygnal: a vision of a future where the signals our bodies already produce are translated into something patients and doctors can understand.

What it does

Sygnal makes pain visible.

Using a wearable biosensor, it continuously reads physiological signals related to pain, such as nociceptor activity, inflammation markers, and stress responses, and translates them into clear visual insights.

The system maps pain onto a 3D body model, tracking where it occurs, how intense it is, and how it changes over time. It logs pain events, highlights peak episodes, and builds a Pain Portrait: a personalized report that helps patients and doctors understand patterns in the body’s signals.
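Since Sygnal is a speculative concept, there is no production code behind this, but the peak-episode detection described above could be sketched roughly as follows. This is a hypothetical illustration: the `Episode` record, function name, and threshold value are our assumptions, not part of the actual design.

```python
from dataclasses import dataclass

@dataclass
class Episode:
    start: int    # index where intensity first crosses the threshold
    end: int      # index where it falls back below the threshold
    peak: float   # maximum intensity reached during the episode

def detect_episodes(signal, threshold=0.6):
    """Scan a normalized pain-intensity series (0.0-1.0) and return
    each contiguous run above `threshold` as an Episode record."""
    episodes = []
    start, peak = None, 0.0
    for i, value in enumerate(signal):
        if value >= threshold:
            if start is None:
                start, peak = i, value   # episode begins
            else:
                peak = max(peak, value)  # episode continues
        elif start is not None:
            episodes.append(Episode(start, i - 1, peak))  # episode ends
            start, peak = None, 0.0
    if start is not None:  # episode still open at end of series
        episodes.append(Episode(start, len(signal) - 1, peak))
    return episodes
```

For example, `detect_episodes([0.1, 0.7, 0.9, 0.5, 0.8])` would yield two episodes: one spanning indices 1–2 with a peak of 0.9, and one at index 4 with a peak of 0.8. Each episode could then be tagged with a body region and timestamp to feed the 3D map and the Pain Portrait.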

Instead of relying only on subjective descriptions, Sygnal turns pain into measurable, visual information that can support diagnosis and treatment.

How we built it

We began by researching the quantified self movement and sensory systems beyond the traditional five senses. This led us to explore nociception, the body’s biological system for sensing pain.

We mapped how physiological signals could be translated into useful insights for patients and clinicians.

The experience was designed using:

  • Figma for UI design and prototyping
  • FigJam for brainstorming and concept exploration
  • Figma Slides for storytelling

We also used LLM tools like ChatGPT and Claude during research and ideation to explore concepts and iterate quickly.

Challenges we ran into

One of our biggest challenges was the early brainstorming stage. The prompt required exploring sensory experiences that are often invisible, which made it difficult to identify the right direction initially.

Another challenge was translating complex physiological signals into a simple and intuitive interface that patients and doctors could easily understand.

We also experimented with Figma Make and other generative tools. While these helped us explore ideas quickly, achieving the exact results we wanted often required multiple prompt refinements and design iterations.

Accomplishments that we're proud of

We’re proud of designing a system that tackles a deeply human problem: making invisible pain visible.

In a short build window, we created a complete concept including a wearable sensor system, a real-time pain dashboard, a 3D body pain map, historical signal tracking, peak episode detection, and a personalized Pain Portrait for doctor visits.

We’re especially proud of translating complex physiological signals into a simple, intuitive interface that empowers patients to better understand and communicate their pain.

What we learned

This project showed us how powerful speculative design can be for rethinking complex healthcare problems. We learned how many signals the body constantly produces and how those signals could reveal patterns that are currently invisible to patients and doctors.

We also learned how to combine research, rapid prototyping in Figma, and AI tools like ChatGPT and Claude to explore ideas quickly and iterate on the experience.

Most importantly, we learned that good design can make complex biological data understandable, and potentially empowering, for people living with pain.

What's next for Sygnal

Although Sygnal is a speculative concept, many of the technologies behind it are already emerging in wearable health research.

Future directions could include:

  • integrating advanced wearable biosensors
  • improving machine learning models for pain signal detection
  • developing clinical tools for interpreting pain patterns
  • expanding the system to support chronic illness monitoring

Our vision is a future where pain no longer needs to be doubted or minimized; it can be seen and understood.

Built With

  • canva
  • chatgpt
  • claude
  • elevenlabs
  • figjam
  • figma
  • grb
  • make
  • veo2