Inspiration

28 million Americans will experience an eating disorder in their lifetime. One person dies every 52 minutes — the highest mortality rate of any mental illness. One of us has spent the past two years supporting a loved one through atypical anorexia recovery, navigating the daily reality of a body that lies to the person living in it.

During recovery, an ancient survival system — what eating disorder physician Dr. Jennifer Gaudiani calls the "cave person brain" — rewires the body's interoceptive signals. Patients feel full after three bites (that's gastroparesis, not fullness). They feel fine on dangerously few calories (that's cortisol masking starvation). They feel like they ate too much after a normal meal (that's a post-meal autonomic surge). Recovery programs say "listen to your body." But the body has become an unreliable narrator.

Dr. Gaudiani calls these complications "the unmeasurables" — real physical suffering that modern medicine can't easily detect. A patient sees their therapist for one hour a week. The other 167 hours, the cave person brain operates unchecked. There are zero tools that detect interoceptive distortion in real time. We built Intercept to change that.

What it does

Intercept is a speculative interoceptive interface for eating disorder recovery — a peacock feather-shaped shoulder patch paired with a companion app.

The patch reads autonomic biomarkers (heart rate variability, skin temperature, galvanic skin response) and does exactly two things:

  • Warmth — when the cave person brain is quiet, and signals are trustworthy, the feather is gently warm against the shoulder. Comfort. Presence. "I'm here. You're okay."
  • A tap — when hibernation mode activates, a single gentle press into the shoulder. "What you're feeling right now isn't the whole story."
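The two-output model above can be sketched as a minimal state classifier. Everything here is an illustrative assumption, not part of the actual design: the `Biomarkers` fields, the `device_state` function, and the threshold values are placeholders, not clinical parameters.

```python
from dataclasses import dataclass

@dataclass
class Biomarkers:
    """Hypothetical normalized sensor readings from the patch (0-1 scale)."""
    hrv: float        # heart rate variability; low suggests sympathetic stress
    skin_temp: float  # peripheral skin temperature; low suggests vasoconstriction
    gsr: float        # galvanic skin response; high suggests autonomic arousal

def device_state(b: Biomarkers, hrv_floor: float = 0.4,
                 gsr_ceiling: float = 0.7) -> str:
    """Map biomarker readings to one of the two outputs.

    Thresholds are illustrative placeholders, not validated clinical values.
    Returns "warmth" when signals look trustworthy, "tap" when the
    autonomic pattern suggests hibernation mode.
    """
    hibernation = b.hrv < hrv_floor or (b.gsr > gsr_ceiling and b.skin_temp < 0.5)
    return "tap" if hibernation else "warmth"
```

The point of the sketch is the deliberately tiny output space: whatever the sensors read, the patient only ever receives one of two signals.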

The patient app speaks only in "cave person brain" language — never clinical, never quantitative. It shows when hibernation mode is active, provides context in plain language, and tracks the week qualitatively. No numbers, no scores, no streaks, no calories, no weight. The eating disorder weaponizes data, so we stripped it all out.

The clinician dashboard shows the real data — hibernation episode frequency, HRV trends, post-meal activation duration. A therapist can now say: "I can see that last Tuesday after lunch, your cave person brain went into hibernation mode for two hours. What was happening for you?" That clinical tool does not exist today.
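A dashboard metric like "hibernation episode frequency" could be derived by grouping timestamped device states into episodes. This is a hedged sketch under assumed inputs: the `(datetime, state)` tuple format and the `min_gap` merge window are hypothetical, not part of the documented design.

```python
from datetime import datetime, timedelta

def hibernation_episodes(samples, min_gap=timedelta(minutes=10)):
    """Group timestamped "tap" readings into discrete episodes.

    samples: list of (datetime, state) tuples, sorted by time.
    Readings separated by more than min_gap start a new episode.
    Returns a list of (start, end) pairs, one per episode.
    """
    episodes = []
    start = prev = None
    for t, state in samples:
        if state != "tap":
            continue
        if start is None:
            start, prev = t, t
        elif t - prev > min_gap:
            # Gap too large: close the current episode, open a new one.
            episodes.append((start, prev))
            start, prev = t, t
        else:
            prev = t
    if start is not None:
        episodes.append((start, prev))
    return episodes
```

Episode count per week and episode durations (end minus start) would feed the frequency and post-meal activation views without ever surfacing raw numbers to the patient.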

Three use cases:

  1. A patient eats three bites at lunch and feels completely full. The patch detects gastroparesis markers and taps. The app shows: "This is hibernation mode — not reality." She shares this with her dietitian.
  2. It's 2 AM. A patient feels the urge to exercise. The patch reads cortisol-driven restlessness and taps. She screenshots the app for her Thursday therapy session.
  3. A patient has a quiet week. The timeline shows five days of trustworthy signals. She sees — for the first time — that her body is coming back online.

Target audience: Anyone in eating disorder recovery working with a clinical care team — teens through adults, across every ED diagnosis (anorexia, bulimia, binge eating disorder, ARFID, OSFED).

Wellness goal: Rebuilding interoceptive accuracy — teaching the body and brain to trust each other again.

How we built it

We designed Intercept entirely in Figma Make and Figma Design, with presentation materials in Figma Slides.

The clinical foundation comes from Dr. Jennifer Gaudiani's Sick Enough: A Guide to the Medical Complications of Eating Disorders (Routledge, 2019), which documents how starvation distorts interoceptive signals across all eating disorder types.

The physical patch was designed through iterative AI image generation (Gemini Nano Banana 2), exploring over 15 form factor concepts — from abdominal bands to ear-hook devices to rings — before converging on the peacock feather smart-tattoo. Each rejected form factor had a specific design rationale documented in our process.

The UI was built in Figma Make using carefully engineered prompts that enforced our design system: dark navy (#0A1128) backgrounds, Cormorant Garamond serif headings, Inter sans-serif body text, and a strict gold/teal color language representing the two device states.

Design decisions were informed by Laws of UX (Jon Yablonski), Refactoring UI (Adam Wathan & Steve Schoger), The Design of Everyday Things (Don Norman), and Don't Make Me Think (Steve Krug) — specifically the Aesthetic-Usability Effect, Peak-End Rule, Miller's Law, and progressive disclosure.

Challenges we ran into

The harm safeguard paradox. Every feature we considered had to pass one test: "Can the eating disorder use this?" Showing hunger data? The ED says "see, I can ignore that." Showing progress scores? The ED optimizes them like calorie counts. Showing streaks? Perfectionism takes over. We cut more features than we kept. The hardest design challenge wasn't building — it was deliberately choosing what to leave out.

The form factor journey. We explored and rejected 12+ physical designs. An ear-hook device was functionally grounded (auricular vagus nerve stimulation is real research), but an eating disorder already feels like a voice in your head — putting a device on the ear reinforces that dynamic. A wrist device felt like "an Apple Watch with extra steps." An undergarment was too difficult to prototype. The peacock feather emerged from the intersection of cultural resonance, visual distinctiveness, and metaphorical precision — the eye sees what you can't.

Learning Figma under pressure. Neither of us had used Figma before this weekend. Building a competition-grade prototype while learning the tool simultaneously — across a weekend that also included a YC Bio x AI hackathon and a nonprofit fundraiser — was the most compressed learning curve we've ever experienced.

Accomplishments that we're proud of

The language system. The strict separation between patient-facing language ("cave person brain," "hibernation mode," "signals trustworthy") and clinical language (HRV, gastroparesis, autonomic activation) is the design decision we're most proud of. It emerged from a single principle: Intercept never gives the eating disorder a sentence it can finish. "You're hungry" — the ED finishes: "and you're weak." But "your cave person brain is in survival mode" — the ED has nowhere to go with that.

The care team requirement. Making Intercept impossible to use without a clinician code was a deliberate design choice, not a limitation. Eating disorders have the highest mortality rate of any mental illness. A shoulder patch is not a substitute for a treatment team.

The peacock feather. A form factor that is culturally resonant (Krishna, protection, beauty), metaphorically precise (the eye sees what you can't), and radically non-medical. It looks like adornment, not hardware.

What we learned

Designing what to hide is harder than designing what to show. Most apps add features. Intercept's entire design philosophy was removal. Every number, percentage, score, and progress bar we deleted was a deliberate act of protection.

Speculative design requires clinical grounding. Without Dr. Gaudiani's framework, this would have been a wellness concept. With it, every design decision — from the two-output interaction model to the cave person brain language — has a clinical rationale documented across 277 pages of medical literature.

The best UX sometimes means doing less. Laws of UX and Refactoring UI taught us that psychological safety IS the user experience for this population. Beauty builds trust for a user whose body is lying to them.

What's next for Intercept

Clinical validation. Partnering with eating disorder treatment centers to validate whether the cave person brain language framework resonates with patients and clinicians in practice.

Sensor research. Exploring whether HRV, skin temperature, and galvanic skin response measured from the shoulder can reliably distinguish starvation-induced autonomic distortion from baseline — building on existing wearable biosensing research.
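One plausible starting point for that research is comparing new readings against a personal baseline window. The sketch below is illustrative only; the `deviation_from_baseline` helper and z-score approach are assumptions, and real detection would require validated sensor studies.

```python
from statistics import mean, stdev

def deviation_from_baseline(baseline: list[float], reading: float) -> float:
    """Z-score of a new reading (e.g. HRV) against a personal baseline window.

    A strongly negative score could flag a candidate hibernation-mode
    signal; a score near zero suggests the reading matches baseline.
    Degenerate case: a flat baseline (zero variance) yields 0.0.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    return (reading - mu) / sigma if sigma else 0.0
```

Per-user baselines matter here because autonomic measures like HRV vary widely between individuals, so population-level thresholds would misclassify many patients.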

Expanded recovery types. The cave person brain framework applies across all ED types. Future iterations would map the specific interoceptive distortion patterns of binge eating disorder, ARFID, and OSFED alongside the restriction and purging patterns already documented.

The ultimate goal: An app that makes itself obsolete. As interoceptive accuracy rebuilds, Intercept gradually reduces its feedback — until the patient no longer needs it. That's recovery.

Built With

  • claude
  • figma