Inspiration
You're reading this right now with your head tilted forward. Maybe your shoulders are rounded. You probably haven't blinked in the last ten seconds. You didn't notice any of it and that's the problem.
The average desk worker spends 6-8 hours a day in front of a screen, and research shows most of us develop measurable postural decline within the first 20 minutes of sitting down. The damage is invisible and cumulative: chronic neck strain, eye fatigue, tension headaches, and a slow erosion of focus that we mistake for laziness. Every existing solution treats this the same way: a notification that interrupts your work to tell you to sit up straight.
The irony is painful: the app designed to protect your wellbeing breaks the exact flow state it's supposed to preserve. You dismiss it, slouch again in two minutes, and the cycle repeats. We wanted to ask a different question. What if your workspace could feel you struggling before you even noticed? What if instead of shouting at you, it just quietly adapted, dimming the screen, shifting the soundscape, nudging you through a living companion that cares whether you take care of yourself?
What it does
Axis is a bio-responsive desktop app that uses your webcam to monitor posture, blink rate, and facial tension in real time. Instead of sending notifications, it adjusts your environment: screen warmth shifts, ambient sounds fade in, and a virtual pixel cat companion reacts to your wellbeing. Every session starts with a pixel egg: maintain good posture and focus to fill the evolution bar, hatch it into a cat, and watch it grow through stages as you stay healthy. Neglect yourself and it wilts, curls up, and falls asleep until you take a break. A built-in pomodoro timer and multiplayer leaderboard gamify the experience, letting you compete with friends to see who's the most locked in.
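The egg-to-cat loop can be sketched as a small state machine. This is an illustrative sketch, not Axis's actual code: the type names, the 0-100 evolution bar, and the hatch thresholds are all assumptions.

```typescript
// Hedged sketch of a pet health state machine driven by a wellness score.
type PetStage = "egg" | "hatchling" | "cat";
type PetMood = "thriving" | "wilting" | "asleep";

interface PetState {
  stage: PetStage;
  mood: PetMood;
  evolution: number; // 0..100, fills while posture and focus stay good
}

function tickPet(state: PetState, healthScore: number): PetState {
  // healthScore in 0..1, assumed to come from the posture/blink/tension engines.
  const next = { ...state };
  if (healthScore >= 0.7) {
    next.mood = "thriving";
    next.evolution = Math.min(100, next.evolution + 1);
  } else if (healthScore >= 0.4) {
    next.mood = "wilting"; // pet visibly droops, nudging the user
  } else {
    next.mood = "asleep"; // pet naps until the user takes a break
  }
  // Thresholds are made up for illustration: hatch at 50, full cat at 100.
  if (next.stage === "egg" && next.evolution >= 50) next.stage = "hatchling";
  if (next.stage === "hatchling" && next.evolution >= 100) next.stage = "cat";
  return next;
}
```

A sprite renderer would then pick an animation from `stage` and `mood` each frame.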
How we built it
Electron with React and TypeScript for the desktop shell. MediaPipe Holistic and Human.js for real-time pose estimation and facial landmark detection, 33 body points and 468 face mesh points processed per frame. Custom scoring engines calculate posture alignment, blink rate, and cognitive load using rolling buffers and hysteresis to prevent score jitter. macOS brightness CLI and CoreGraphics gamma tables for real screen warmth control. Tone.js for layered ambient audio. Pixel art bio-pet with sprite-based animation driven by the health state machine. MapLibre GL for the multiplayer leaderboard visualization.
Challenges we ran into
MediaPipe's WASM backend inside Electron's renderer process kept crashing silently. We initially assumed it was a model-loading issue, but after hours of debugging discovered it was a sandbox conflict with Electron's context isolation. Disabling nodeIntegration and configuring the correct CSP headers finally resolved it.
The scoring system went through three iterations. Our first approach used raw joint angles directly from landmarks, but the values jittered so badly the UI was unusable. We moved to exponential moving averages, which smoothed the data but introduced visible lag. The final version combines EMA with 3-second hysteresis windows: scores only commit to a new state after holding steady, giving smooth transitions that still feel responsive.
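The EMA-plus-hysteresis idea looks roughly like this. A minimal sketch, assuming a single good/bad boundary on the smoothed score; the alpha, threshold, and hold values are illustrative, not Axis's tuned parameters.

```typescript
// Smooth raw per-frame scores with an EMA, then commit state changes only
// after the new state has held for a full hysteresis window.
class HysteresisScore {
  private ema: number | null = null;
  private committed = "good";
  private candidate = "good";
  private candidateSince = 0;

  constructor(
    private alpha = 0.2,     // EMA smoothing factor (assumed)
    private holdMs = 3000,   // 3-second hysteresis window
    private threshold = 0.5  // good/bad boundary on the smoothed score
  ) {}

  update(raw: number, nowMs: number): string {
    // Exponential moving average damps frame-to-frame jitter.
    this.ema = this.ema === null ? raw : this.alpha * raw + (1 - this.alpha) * this.ema;
    const observed = this.ema >= this.threshold ? "good" : "bad";
    if (observed !== this.candidate) {
      this.candidate = observed;
      this.candidateSince = nowMs;
    }
    // Commit only after the candidate state has held steady long enough.
    if (this.candidate !== this.committed && nowMs - this.candidateSince >= this.holdMs) {
      this.committed = this.candidate;
    }
    return this.committed;
  }
}
```

Brief dips never reach the UI, while a sustained slouch flips the state after three seconds.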
Ambient control was deceptively hard. Our first attempt changed brightness in discrete steps, which felt jarring even at small increments. We solved this by layering gradual CSS filter transitions on top of native CoreGraphics gamma adjustments, blending them so the shift feels like natural light changing in a room rather than a setting being toggled.
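The core of the fix is easing toward the target instead of jumping to it, then splitting the value across the two layers. This is a hedged sketch: the function names, the exponential ease, and the 70/30 split between CSS and gamma are assumptions for illustration.

```typescript
// Ease the warmth value toward its target by a fraction of the remaining
// distance each tick; many small steps make the shift feel like room light
// changing rather than a setting being toggled.
function rampWarmth(current: number, target: number, dtMs: number, rampMs = 4000): number {
  const step = Math.min(1, dtMs / rampMs);
  return current + (target - current) * step;
}

// Blend the warmth across the two mechanisms (hypothetical 70/30 split):
// a CSS filter layer in the renderer plus native gamma-table adjustment.
function splitWarmth(warmth: number): { cssFilter: number; gammaShift: number } {
  return { cssFilter: warmth * 0.7, gammaShift: warmth * 0.3 };
}
```

Called every animation frame, `rampWarmth` converges smoothly on the target with no visible steps.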
The blink detection initially used a count-based buffer that never pruned old timestamps. If a user blinked 5 times then stopped entirely, those 5 stale entries stayed in the array and the system permanently reported a healthy blink rate. This phantom reading cascaded into the fatigue score, the stress estimator, and the pet's health state: everything downstream thought the user was fine when they weren't. Switching to time-based pruning with a 60-second sliding window fixed it immediately.
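The fix amounts to dropping timestamps that fall outside the window before computing the rate. A minimal sketch with illustrative names; the real Axis code differs.

```typescript
// Blink-rate tracker with time-based pruning over a 60-second sliding window.
class BlinkRateTracker {
  private timestamps: number[] = [];
  constructor(private windowMs = 60_000) {}

  recordBlink(nowMs: number): void {
    this.timestamps.push(nowMs);
  }

  blinksPerMinute(nowMs: number): number {
    // Prune stale entries so a user who stops blinking stops looking healthy.
    const cutoff = nowMs - this.windowMs;
    this.timestamps = this.timestamps.filter((t) => t >= cutoff);
    return this.timestamps.length * (60_000 / this.windowMs);
  }
}
```

With the old count-based buffer, the five stale blinks would have kept the rate at 5 forever; here it decays to 0 once they age out of the window.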
Accomplishments that we're proud of
The full ambient loop (webcam to posture score to screen warmth shift) runs in real time with no perceptible delay. The pixel cat's emotional state transitions feel natural rather than binary. The pomodoro timer integrates directly with the wellness system, treating break compliance as a health signal. We shipped real system-level screen control on macOS, not just CSS filters in a browser. And despite half the team never having touched Electron or computer vision before this weekend, every member shipped production code, from the CV pipeline to the sprite animation system to the ambient audio engine.
What we learned
The human blink rate averages 15-20 per minute and drops to 3-4 during intense screen focus; that signal alone is a powerful fatigue detector. Rolling averages and hysteresis matter more than raw accuracy for any real-time biometric UI. And Electron gives you enough native access to build genuinely useful desktop tools, not just wrapped web apps. We also learned that building something you'd actually use yourself changes how a team works. Midway through the hackathon, we caught ourselves instinctively sitting up straighter whenever the cat started drooping on screen, and that was before the scoring was even calibrated properly.
What's next for Axis
Multiplayer rooms where teams' pets coexist and collective posture keeps them all alive. Spotify Wrapped-style weekly posture reports. And an open API so any desktop app can become bio-responsive.