Inspiration

We’ve all had those study or work stretches where time vanishes, posture collapses, and suddenly you’re exhausted without ever noticing it happen. For people with ADHD, hyperfocus can act like noise-canceling for fatigue: it mutes the warning signs until it snaps and leaves you drained.

WorkSleep grew out of that lived frustration: we wanted a gentle, private nudge system that spots the early signs of burnout before they snowball. Instead of guilt or productivity pressure, it gives you permission to pause—right when your brain wouldn’t ask for it on its own.

We focused on keeping everything on your device and making the logic transparent so it feels like a helpful teammate, not surveillance. The goal is simple: protect focus by protecting recovery, one smart break at a time.

What it does

WorkSleep watches for subtle signs that you’re getting tired: drooping eyelids, drifting focus, a slouch in your posture, small head nods, or simply forgetting to blink. When your overall fatigue score crosses a threshold, it doesn’t just warn you; it gently locks the screen with a calming overlay, giving you space to take a real break. Over time, it learns your personal rhythm and adjusts the length of your breaks with a clear formula (duration = scaler × weighted_score), so your rests feel natural and helpful, not random or preachy.
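As a sketch, the rule above can be expressed roughly like this. The index names, weights, and default scaler here are illustrative assumptions, not the app’s tuned values:

```python
# Illustrative sketch of the break rule: four 0..1 fatigue indices are
# blended into one weighted score, and break length scales with it.
# The weights and default scaler below are made-up placeholders.
WEIGHTS = {"eye_openness": 0.35, "attention_drift": 0.25,
           "slouch": 0.20, "yawning": 0.20}

def weighted_score(indices):
    """Combine the per-signal fatigue indices into a single 0..1 score."""
    return sum(WEIGHTS[name] * value for name, value in indices.items())

def break_duration_s(indices, scaler=300.0):
    """duration = scaler * weighted_score; the scaler is learned per user."""
    return scaler * weighted_score(indices)
```

With these placeholder weights, indices of 0.8 / 0.4 / 0.3 / 0.2 give a weighted score of 0.48 and a 144-second break.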

How we built it

We combined OpenCV and MediaPipe (FaceMesh and Pose) to track facial and posture cues in real time, all processed locally for privacy. The system calculates four fatigue indices—eye openness, attention drift, slouching, and yawning—then stores them in a lightweight SQLite database alongside session data. The Tkinter overlay handles the screen lock during breaks, and our learning system blends past behavior with live data to adapt instantly. Even new tasks can “borrow” knowledge from previous ones through semantic similarity, so the app starts smart instead of starting over.
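For the eye-openness signal, a common approach on top of FaceMesh landmarks is the eye aspect ratio (EAR). A minimal geometric version, assuming six eye-contour points in the conventional p1..p6 order, might look like:

```python
import math

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|): the two vertical lid
    gaps over the horizontal eye span. The value drops toward 0 as the
    eye closes. Points are (x, y) tuples, e.g. pixel coordinates of
    FaceMesh landmarks around one eye."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))
```

Thresholding a smoothed EAR over time is one way to derive the eye-openness index the paragraph above describes.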

Challenges we ran into

Detecting fatigue isn’t simple—what looks like deep focus for one person might signal exhaustion for another. Lighting, camera angle, and user movement made early detection inconsistent, and ADHD users added another layer: some needed short “microbreaks,” while others required longer recovery time. We refined our thresholds, stabilized posture detection for angled cameras, and added real-time learning so the system adapts within minutes instead of weeks. Most importantly, we made every trigger explainable—so users always understand why a break happens.
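One way to adapt within minutes rather than weeks, sketched here with made-up constants, is to track a per-user baseline with an exponential moving average and only trigger on a clear margin above it:

```python
class AdaptiveThreshold:
    """Illustrative per-user calibration: an exponential moving average
    tracks each user's normal fatigue level, and a trigger fires only
    when the live score exceeds that baseline by a margin. The alpha
    and margin values here are placeholders, not tuned constants."""

    def __init__(self, alpha=0.05, margin=0.25):
        self.alpha = alpha      # EMA smoothing factor (0..1)
        self.margin = margin    # how far above baseline counts as fatigue
        self.baseline = None

    def update(self, score):
        """Fold in a new fatigue score; return True if a break should fire."""
        if self.baseline is None:
            self.baseline = score
        else:
            self.baseline = (1 - self.alpha) * self.baseline + self.alpha * score
        return score > self.baseline + self.margin
```

Because the baseline moves with the user, the same absolute score can mean “deep focus” for one person and “exhaustion” for another, which matches the variability described above.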

Accomplishments that we're proud of

We created an assistant that feels like a partner, not a watchdog. The logic is easy to understand, everything runs locally for privacy, and the adaptive scaling makes breaks feel personalized instead of forced. New tasks automatically learn from older ones, head-nod detection is accurate across camera angles, and we added safeguards to prevent back-to-back triggers. The best part? When people ask what it does, you can explain it in one sentence—and that simplicity builds real trust.
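The back-to-back safeguard can be as simple as a cooldown guard; this sketch (the minimum gap and injectable clock are illustrative choices, not the app’s exact design) is one way to express it:

```python
import time

class Cooldown:
    """Debounce guard: refuse a new break trigger until a minimum gap
    has passed since the last one. The clock is injectable for testing;
    the default gap is a placeholder value."""

    def __init__(self, min_gap_s=600.0, clock=time.monotonic):
        self.min_gap_s = min_gap_s
        self.clock = clock
        self.last = None

    def allow(self):
        """Return True (and record the time) if a trigger may fire now."""
        now = self.clock()
        if self.last is not None and now - self.last < self.min_gap_s:
            return False
        self.last = now
        return True
```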

What we learned

Tiny numerical decisions (like shifting the “fully open” EAR threshold) can drastically change user trust, so tuning had to be empathetic, not just statistical. Transparency turns friction into cooperation; users embrace enforced pauses when the trigger path (indices → weighted_score → scaler × duration) is obvious. Personalization needs two tempos: fast blend-at-trigger for responsiveness and slower scaler evolution for stability. And consistent naming (scaler vs. legacy timer_coefficient) matters more than we first admitted—clarity compounds.
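The two tempos above might be sketched as two update rules, one fast and one slow. The blend weight and learning rate here are illustrative, not the values WorkSleep ships with:

```python
def blend_at_trigger(live_score, history_score, w_live=0.7):
    """Fast tempo: when a break fires, weight the live reading heavily
    so the response reflects what is happening right now."""
    return w_live * live_score + (1.0 - w_live) * history_score

def evolve_scaler(scaler, observed_need, lr=0.02):
    """Slow tempo: nudge the per-user scaler a small step toward the
    break length the user actually seemed to need, keeping the
    duration formula stable over time."""
    return scaler + lr * (observed_need - scaler)
```

Keeping the fast path in the trigger and the slow path in the scaler means responsiveness never destabilizes the formula users have learned to predict.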

What's next for WorkSleep

We want WorkSleep to live on your phone so it can spot eye strain and drowsy patterns from the front camera and gently nudge you to take a real break from the screen. The idea is a lightweight, on-device companion that respects privacy and timing—just-in-time reminders to look up, walk around, and reconnect with the world.

In parallel, we want to train robust fatigue models from multimodal signals (blink cadence, gaze drift, head pose dynamics), using privacy-preserving pipelines so the data stays yours. The goal isn’t a medical device yet, but research-grade screening that can support clinicians and coaches with interpretable features rather than black-box scores.

Longer term, we imagine a glasses-based companion that runs edge inference on subtle stress and fatigue cues—blink rate, micro head nods, posture drift—and offers gentle biofeedback to help you choose less stressful paths in the moment. Think of it as a calm, wearable guide that learns how your body reacts to certain stimuli and helps you make decisions that protect your energy, not just your calendar.

Built With

Python, OpenCV, MediaPipe, SQLite, Tkinter