Inspiration

Pain is the body's most critical warning signal, yet it is completely invisible to everyone except the person feeling it.

We kept coming back to one question: what happens when the person feeling it can't tell you?

A baby crying at 2am. A dementia patient who lost language months ago. A trauma victim in the ICU, sedated and unresponsive. These aren't edge cases: they're millions of people, every day, suffering from pain that goes undetected because our only tool for measuring it is asking.

The research stopped us cold:

  • Adults identify the cause of a baby's pain cry at just ~54% accuracy, statistically indistinguishable from random chance (Corvin et al., Current Biology, 2022)
  • Between 50–80% of dementia patients experience daily pain that goes unrecognized and untreated (Mayo Clinic Health System, 2023)
  • Over 30–50% of ICU patients develop delirium, completely blocking pain communication (PMC Critical Care Review, 2021)

We realized we weren't solving a tech problem. We were solving a human communication gap that medicine hasn't been able to close. That became SOMA.


What it does

SOMA is a soft biometric wearable paired with a mobile app that detects, translates, and visualizes pain signals in people who cannot communicate them.

The wearable continuously reads:

  • Skin Temperature: Checking for sudden heat or cooling in a specific area, which can indicate inflammation or distress.
  • Heart Rhythm (HRV): Measuring the tiny variations in time between heartbeats. When you're in pain, your heart rhythm usually becomes more "rigid" or less variable.
  • Sweaty Palms (GSR): Measuring electrical changes on the skin caused by moisture. It's a classic sign of the body's "fight or flight" stress response.
  • Restlessness: Tracking unusual or jerky movements—basically, seeing if the person is squirming or thrashing more than usual.
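
The HRV "rigidity" idea has a standard quantification: RMSSD, the root mean square of successive differences between heartbeats, where lower values mean a less variable rhythm. A minimal sketch of that metric (the function name and sample intervals are ours, not from SOMA's firmware):

```javascript
// RMSSD: root mean square of successive differences between
// consecutive inter-beat intervals (in milliseconds).
// Lower RMSSD = less beat-to-beat variability ("rigid" rhythm).
function rmssd(ibiMs) {
  if (ibiMs.length < 2) throw new Error("need at least two intervals");
  let sumSq = 0;
  for (let i = 1; i < ibiMs.length; i++) {
    const d = ibiMs[i] - ibiMs[i - 1];
    sumSq += d * d;
  }
  return Math.sqrt(sumSq / (ibiMs.length - 1));
}

// A relaxed rhythm varies more beat-to-beat than a stressed one:
const relaxed = [800, 840, 790, 860, 810];
const stressed = [800, 802, 799, 801, 800];
```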

These signals are fused on-device and mapped to a 3D body model in real time, producing a pain heatmap that any caregiver can read at a glance.
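
The fusion step is described only at a high level; one plausible shape is a clamped weighted sum of normalized channels. A sketch under that assumption (channel names and weights are illustrative placeholders, not SOMA's actual model):

```javascript
// Fuse normalized sensor channels (each in [0, 1]) into a single
// severity score in [0, 1]. Weights are illustrative only.
const WEIGHTS = { temp: 0.2, hrv: 0.3, gsr: 0.3, motion: 0.2 };

function fuseSeverity(signals) {
  let score = 0;
  for (const [channel, weight] of Object.entries(WEIGHTS)) {
    const v = signals[channel] ?? 0;              // missing channel = no contribution
    score += weight * Math.min(1, Math.max(0, v)); // clamp each channel to [0, 1]
  }
  return score;
}
```

Because the weights sum to 1 and each channel is clamped, the fused score stays in [0, 1] and can drive the heatmap directly.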

Key features:

  • Body Map Home Screen: live severity heatmap on a 3D human figure
  • Signal Detail Cards: pain type, confidence %, timestamp, suggested action
  • Context Tagging: log triggers (after activity, during sleep, stress-related)
  • History Timeline: every incident logged, timestamped, named by region
  • Device Dashboard: live telemetry (heart rate, galvanic skin response, battery, signal)
  • Semantic Zoom: four levels from macro region → exact tissue

How we built it

We built SOMA as a speculative design project, working across concept, research, UX, and prototype simultaneously.

Concept & Research: We grounded every design decision in peer-reviewed literature. The three core stats that shaped the product came from Current Biology, Mayo Clinic Health System, and Frontiers in Pain Research. We mapped nociception as a human sense, one of the 22–33 documented senses, and designed SOMA as an extension of that sense outward, from self to other.

UX & Prototyping: The app was designed and prototyped in Figma Make with a custom dark theme (black background, teal #00B4A6 accent). We built five core screens: Home Body Map, Signal Detail, Log Details, History Timeline, and Device Status.

The 3D body model uses a .glb asset rendered with Three.js + GLTFLoader, with raycasting for tap-to-region detection and a warm heatmap overlay (yellow → orange → red) mapped to signal severity.
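
The severity-to-color ramp can be sketched as linear interpolation across yellow → orange → red stops; the RGB stop values below are our placeholders, not SOMA's exact palette:

```javascript
// Map severity in [0, 1] onto a yellow -> orange -> red ramp.
// Stop colors are placeholder RGB values.
const STOPS = [
  [255, 235, 59],  // yellow, severity 0.0
  [255, 152, 0],   // orange, severity 0.5
  [244, 67, 54],   // red,    severity 1.0
];

function heatColor(severity) {
  const s = Math.min(1, Math.max(0, severity)); // clamp input
  const seg = s < 0.5 ? 0 : 1;                  // which pair of stops
  const t = (s - seg * 0.5) / 0.5;              // position within the pair
  const [a, b] = [STOPS[seg], STOPS[seg + 1]];
  return a.map((c, i) => Math.round(c + (b[i] - c) * t)); // per-channel lerp
}
```

The resulting RGB triple can be written into the heatmap overlay texture for the tapped region.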

Physical Device: We would have loved to build a working hardware prototype, but limited time meant we focused on the software and interaction design instead.


Challenges we ran into

Mapping pain to anatomy accurately: Pain doesn't follow clean body-part boundaries. A signal in the lower left abdomen could be muscular, digestive, or referred. We had to design a confidence system rather than a certainty system: SOMA presents interpretations with percentage confidence, not diagnoses.

Avoiding false authority: The biggest ethical design challenge was that SOMA must feel helpful without feeling like a doctor. Every detail card uses language like "detected signal consistent with muscle strain" rather than "you have muscle strain." Getting that language right took many iterations.

Semantic zoom on a 3D model: Implementing four zoom levels (Macro → Meso → Micro → Diagnostic) on a rotating 3D figure without overwhelming the user required careful information architecture. Too much detail too early breaks trust; too little makes the app feel useless.

The caregiver vs. self-user split: Our three primary use cases (baby, elderly parent, and self-monitoring adult) have fundamentally different interaction models. We had to design one interface that serves all three without feeling generic to any of them.


Accomplishments that we're proud of

  • Grounding every feature in published, peer-reviewed research rather than assumptions
  • Designing a UI that communicates severity instantly: the heatmap shows at a glance which region is causing the user pain
  • Building a concept that extends a real documented human sense (nociception, sense #8 of 22–33) rather than inventing fictional biology
  • Creating a product that addresses three distinct user populations (infants, elderly, and adults) with a single unified interface
  • Completing a full speculative design cycle: HMW question → research → PRD → prototype → pitch in one session

What we learned

Pain is under-designed for. The entire medical industry's approach to pain assessment in non-verbal patients is behavioral observation: nurses watching faces and body language. There is almost no technology in this space despite a $635B+ annual economic burden from chronic pain alone (National Pain Foundation, 2024).

Speculative design requires real constraints. The most useful creative constraint we gave ourselves was: every sensor must exist today. GSR patches, temperature arrays, HRV monitors, none of this is science fiction. SOMA is a systems design problem, not an invention problem.

Confidence framing changes everything. Showing 84% confidence instead of a binary yes/no completely changes how users relate to the app. It invites them into a collaborative interpretation rather than a verdict.


What's next for SOMA, nociception sensor

Clinical validation: Partner with pediatric hospitals to run controlled studies comparing SOMA's biometric detections against nurse-assessed FLACC pain scores in non-verbal patients.

Caregiver mode: A dedicated interface for parents and professional caregivers with shared monitoring, push alerts, and trend reports across multiple patients.

Pattern learning: A personal baseline model per user. After n observations, SOMA learns your normal and flags deviations, not just absolute thresholds:

$$\text{Alert} = \begin{cases} 1 & \text{if } S_t > \mu_{personal} + 2\sigma \\ 0 & \text{otherwise} \end{cases}$$
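
That rule is a two-sigma outlier test against a personal baseline; a minimal sketch, with the baseline mean and standard deviation computed from logged readings (names illustrative):

```javascript
// Flag a reading more than 2 standard deviations above the
// user's personal baseline mean (the Alert rule above).
function personalAlert(baseline, reading) {
  const n = baseline.length;
  const mean = baseline.reduce((a, b) => a + b, 0) / n;
  const variance = baseline.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  const sigma = Math.sqrt(variance);
  return reading > mean + 2 * sigma ? 1 : 0;
}
```

The same reading can alert for one user and not another, which is exactly the point of a personal baseline over an absolute threshold.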

Referral integration: When a pattern crosses a clinical threshold, SOMA generates a structured pain report exportable directly to a GP or specialist, turning passive detection into actionable medical documentation.

The long vision: A world where no one, not a baby, not a dementia patient, not a trauma survivor, suffers from pain that the people around them can't see.

Built With

Figma Make, Three.js (GLTFLoader)
