About

Apple's live translation feature in iOS 26 showed us that real-time audio processing for social contexts was not only technically feasible but something Apple was already building toward. If a device could translate spoken language on the fly, it could just as plausibly detect the emotional and structural dynamics underneath that language — the pauses, the overlaps, the hesitation patterns. That precedent gave Ripple a credible technical foundation and pushed us to think about what else a device already in your pocket and on your wrist could quietly perceive on your behalf.

Ripple is a speculative conversation dynamics tracker that detects invisible social and conversational signals and translates them into discreet haptic cues through an Apple Watch. It operates in two modes — Social Cue Mode, which tracks emotional availability, engagement, and confusion in others, and Conversation Flow Mode, which tracks turn-taking, interruptions, topic shifts, and pacing. After every session, the iPhone app surfaces a visual timeline and radar chart so users can reflect on patterns and build long-term communication awareness.
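The two modes above can be pictured as a mapping from detected signals to a small haptic vocabulary. The sketch below is illustrative only: the signal names, modes, and pattern assignments are assumptions for this write-up, not the shipped design.

```typescript
// Hypothetical mapping from Ripple's two modes and their signal
// types to discreet Apple Watch haptic patterns.

type Mode = "socialCue" | "conversationFlow";

// Assumed three-pattern haptic vocabulary.
type Haptic = "tap" | "pulse" | "doubleTap";

const hapticFor: Record<Mode, Record<string, Haptic>> = {
  socialCue: {
    confusion: "doubleTap",    // the other person seems lost
    disengagement: "pulse",    // attention drifting
    availability: "tap",       // emotional availability rising
  },
  conversationFlow: {
    interruption: "doubleTap", // you cut someone off
    topicShift: "tap",         // conversation changed direction
    longMonologue: "pulse",    // pacing nudge: yield the floor
  },
};

// Look up the cue for a detected signal; unknown signals fire nothing.
export function cueFor(mode: Mode, signal: string): Haptic | undefined {
  return hapticFor[mode][signal];
}
```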

We designed the full system across two surfaces — iPhone and Apple Watch — using Figma for static screens and Figma Make for interactive prototypes. The live session screen was built as a functional demo with a real-time timer, auto-firing signal events, and haptic flash animations. The radar chart was custom-built as an SVG using a four-point spider grid. The Apple Watch interface was designed around a haptic-first philosophy: every alert is felt before it is seen.
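The geometry behind a four-point spider grid like the one described above is compact enough to sketch: each of four 0–1 scores is projected onto an axis at a 90° interval and the points joined into an SVG polygon. Axis order, center, and radius here are assumptions, not the exact values used in the prototype.

```typescript
// Map four normalized scores (0–1) to the `points` string of an
// SVG <polygon>, with axes at 90° intervals starting from the top.
export function radarPoints(
  scores: [number, number, number, number],
  cx = 100,   // assumed center x
  cy = 100,   // assumed center y
  radius = 80 // assumed max axis length
): string {
  return scores
    .map((s, i) => {
      const angle = (Math.PI / 2) * i - Math.PI / 2; // top, right, bottom, left
      const x = cx + radius * s * Math.cos(angle);
      const y = cy + radius * s * Math.sin(angle);
      return `${x.toFixed(1)},${y.toFixed(1)}`;
    })
    .join(" ");
}
```

The returned string can be dropped straight into `<polygon points="…"/>`, with a second polygon at full scores for the outer grid ring.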

Challenges we ran into

One of our biggest challenges was designing for a sense that doesn't technically exist yet. Without real sensor data to work with, every signal type had to be grounded in behavioral research and human intuition rather than actual input — which meant constantly asking whether what we were designing was genuinely useful or just speculative for its own sake.

Designing the haptic language was harder than expected. A tap, a pulse, and a double tap sound distinct on paper but needed to feel distinct in practice — learnable without a manual, subtle enough not to embarrass someone mid-conversation, and meaningful enough to actually change behavior. Getting that balance right required a lot of iteration.

We also struggled with the line between helpful and intrusive. Ripple sits very close to surveillance territory — it listens, it logs, it quantifies other people's behavior without their explicit knowledge. Making sure the design centered user agency and consent at every touchpoint, rather than treating it as a legal checkbox, took deliberate effort.

Finally, replicating the feel of a live, sensor-driven system inside Figma Make required creative workarounds. Simulating real-time signal events, a live timer, and haptic flash animations in a design tool not built for that kind of logic pushed us to think like engineers as much as designers.
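The real-time simulation workaround can be sketched as a scripted timeline checked against the session timer on each tick: any scripted event whose timestamp has passed is "fired" to trigger the flash animation. The event names and timings below are invented for illustration; only the technique reflects the prototype's approach.

```typescript
// A scripted signal event, standing in for real sensor input.
interface SignalEvent {
  atMs: number;   // offset from session start, in milliseconds
  signal: string; // e.g. "interruption", "topicShift"
}

// Return the scripted events due by `elapsedMs`, in firing order.
// A demo loop diffs successive calls against already-fired events
// to auto-fire each event exactly once.
export function eventsDueBy(
  script: SignalEvent[],
  elapsedMs: number
): SignalEvent[] {
  return script
    .filter((e) => e.atMs <= elapsedMs)
    .sort((a, b) => a.atMs - b.atMs);
}
```

Keeping the check pure (elapsed time in, due events out) makes the fake session deterministic and easy to restart, which matters when the "sensor" is really just a design-tool animation timeline.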

Built With

  • figma
  • figmamake
