KeyEase — Our Story

What Inspired Us

Burnout doesn't announce itself. It accumulates quietly — in the tightness of a deadline, in the emails you draft and delete, in the evenings that don't feel like rest anymore. We've felt it personally. That slow erosion where you're technically functioning but something underneath is running too hot, for too long.

The frustrating part is that most stress tracking tools require you to already know you're stressed. You open an app, you log a mood, you strap on a wearable, all of which require awareness and intention at exactly the moment you have the least of both.

We kept coming back to one question: what if the device you're already using — the one that's already touching your hands for eight hours a day — could notice before you do?

That's where Undercurrent came from.


What We Learned

The Science Was Already There

We didn't invent the idea that stress leaves a physical trace in how you type. Researchers at Carnegie Mellon and the University of Illinois had already done that work. What surprised us was how robust the signal is. Inter-keystroke intervals, finger contact area, touch pressure, movement jerk — these aren't subtle correlations. Under stress, the numbers shift in ways that are statistically significant and physiologically grounded.

The math behind the core stress index uses an exponential moving average to smooth the raw signal:

$$S_t = 0.8 \cdot S_{t-1} + 0.2 \cdot r_t$$

Where $S_t$ is the smoothed stress index at time $t$ and $r_t$ is the raw score from the current 60-second window. This keeps the model responsive without overreacting to a single hard keystroke.
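The smoothing step above can be sketched in a few lines of Python. This is an illustration of the formula, not the shipped implementation; the function and variable names are ours.

```python
def smooth_stress(prev_index: float, raw_score: float,
                  alpha: float = 0.2) -> float:
    """Exponential moving average: S_t = (1 - alpha) * S_{t-1} + alpha * r_t."""
    return (1 - alpha) * prev_index + alpha * raw_score

# A single spiky 60-second window moves the index only a fifth of the
# way toward the raw value, so one hard keystroke can't dominate.
index = 0.3
for raw in [0.3, 0.9, 0.3]:
    index = smooth_stress(index, raw)
```

With alpha = 0.2 the filter matches the 0.8 / 0.2 weights in the equation: lower alpha means a smoother, slower index; higher alpha means a twitchier one.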

The multimodal fusion model weights four independent physiological pathways:

$$\text{StressIndex} = \sum_{i=1}^{n} w_i \cdot \sigma_i$$

Where $w_i$ is the weight of pathway $i$, $\sigma_i$ is its normalised deviation from personal baseline, and $n = 4$ for the current model. Keyboard dynamics alone reach $\approx 71\%$ accuracy. Trackpad alone $\approx 68\%$. Combined, they exceed $89\%$.
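The fusion itself is just a weighted sum, which the sketch below makes concrete. The pathway names and weight values here are hypothetical placeholders, not the model's actual parameters.

```python
def stress_index(deviations: dict[str, float],
                 weights: dict[str, float]) -> float:
    """Weighted sum of per-pathway baseline deviations: sum_i w_i * sigma_i."""
    return sum(weights[k] * deviations[k] for k in weights)

# Hypothetical weights for four pathways (illustrative only).
weights = {"keystroke": 0.35, "trackpad": 0.30, "pressure": 0.20, "jerk": 0.15}
deviations = {"keystroke": 1.2, "trackpad": 0.8, "pressure": 0.5, "jerk": 0.2}
index = stress_index(deviations, weights)
```

Because each $\sigma_i$ is already normalised against the user's own baseline, the weights only have to encode how predictive each pathway is, not how its raw units compare.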

macOS Has the Data — But Doesn't Make It Easy

The trackpad signals we needed live inside MultitouchSupport, a private Apple framework. It exposes rich data — finger contact area, pressure in grams, ellipse angle, major and minor axes — but using it means distributing outside the App Store entirely. No sandbox. Direct .dmg distribution with Sparkle for updates. A meaningful technical constraint that shaped every product decision.

Figma Make Changed How We Prototype

We built the entire visual language — five stress states, the notch UI, the breathing guide, the sound player, the analytics dashboard — in Figma Make before a single line of production code was written. The speed at which we could validate a feeling, share it with the team, and iterate was unlike any workflow we'd used before. Prototyping at this fidelity used to take days. It took hours.

Storytelling Is a Design Tool

We spent as much time on Neil's story as we did on the interface. The video script went through four versions. What we kept learning was that the framing of the product — your laptop already knows — only lands if the audience has felt the thing we're describing first. Story creates the felt sense that makes the technology make sense.


How We Built It

We worked across three parallel tracks:

Research & Science — Deep dive into peer-reviewed literature on keystroke dynamics, neuromotor stress biomarkers, and multimodal biosensing. We mapped ten signals across three sources into a weighted fusion model with personal baseline calibration.

Design & Prototyping — Full product built in Figma Make. Five colour worlds responding to five stress states. A notch UI with eleven distinct frames. A constellation-based trackpad calibration flow. A breathing guide, sound player, and reach-out feature — all living in 126px of screen real estate.

Narrative & Presentation — A video script built as a short film. A character. An arc. A moment of stress, a moment of calm, and an honest look at what the data shows afterwards.
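The personal baseline calibration mentioned in the research track amounts to a per-signal z-score: each live measurement is compared against statistics collected while the user is calm. A minimal sketch, with made-up calibration numbers:

```python
import statistics

def baseline_deviation(value: float, baseline: list[float]) -> float:
    """Normalised deviation from personal baseline (a z-score), i.e. the
    sigma_i that the weighted fusion model consumes."""
    mu = statistics.fmean(baseline)
    sd = statistics.pstdev(baseline)
    return (value - mu) / sd if sd else 0.0

# Illustrative calibration window of inter-keystroke intervals (ms),
# then a live value: faster, tenser typing shows up as a large deviation.
calm_intervals = [180, 175, 190, 185, 170]
z = baseline_deviation(140, calm_intervals)
```

Normalising per person is what lets the same fusion weights work for a fast typist and a slow one: only departures from *your* baseline count, not absolute speed.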


The Challenges

Making the Notch Feel Like a Guest, Not a Surveillance Camera

The earliest version of this concept had the notch proactively telling the user they were stressed. It felt invasive. Clinical. Wrong. The reframe that unlocked everything was simple: the notch doesn't diagnose, it invites. It doesn't say "you're stressed." It says "take a moment" and leaves a door open. The user walks through it or they don't. Either is fine. Getting that tone right — in the copy, in the visual weight, in the timing of the expansion — took longer than any single technical problem.

Scoping for a Hackathon

KeyEase could be a lot of things. A clinical neuromotor health tool. A workplace wellness platform. A consumer sleep and recovery app. At a hackathon, that breadth is a liability. We had to make a deliberate choice: one user, one evening, one emotional arc. Neil, his email, his boss, his fifteen minutes of recovery. That constraint made everything sharper.


What's Next

The passive sensing model is ready to be built natively. The science is validated. The design language is defined. The story is clear.

The only thing left is to ship it.
