Inspiration

The Silent Epidemic of the Digital Age. As developers, we are the architects of the future, yet we are physically decaying in the present. We spend 8+ hours a day locked in a "C-shape" slouch, slowly eroding our spinal health.

Existing solutions failed us in two ways:

Wearables: Require charging, are uncomfortable, and look awkward in public.

Software Nags: Apps that blast notifications or force a "mirror" camera window on screen, breaking flow and inducing self-consciousness.

We asked: What if health monitoring was invisible? We wanted to build a "Ghost in the Machine"—an AI that lives silently in your toolbar, respects your privacy by processing locally, and uses empathy (a reactive 3D companion) instead of annoyance to guide you back to alignment.

What it does

UrBackBuddyAI is a privacy-first, stealth health monitor that transforms your webcam into a biomechanical sensor—without ever showing you a video feed.

Invisible Analysis: The app runs in the background, analyzing 17 key body points at 60FPS using a lightweight AI model.

The Emotional Proxy: Instead of boring charts, your health is visualized as a 3D Robotic Companion.

Good Posture: The robot hovers happily, glows green, and looks around with curiosity.

Bad Posture: The robot shivers, turns red, and looks down in distress.

Granular Vector Scoring: We don't just say "Sit Up." We calculate a 3-axis vector:

Neck Score: Tracks Ear-to-Shoulder alignment (Text Neck).

Shoulder Score: Tracks Vertical Symmetry (Leaning).

Spine Score: Tracks Nose-to-Hip verticality (Slouching).
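The three scores above can be sketched as atan2-based tilt measurements over MoveNet-style keypoints. This is a minimal illustration, not the shipped algorithm: the `tiltFromVertical`/`score` helpers and the angle thresholds (45°, 20°, 30°) are assumptions chosen for clarity.

```typescript
// Sketch of the 3-axis vector scoring, assuming MoveNet-style
// keypoints ({ name, x, y } in image coordinates, y growing downward).
// Helper names and thresholds here are illustrative.

interface Keypoint { name: string; x: number; y: number; }

// Angle (degrees) of the segment a -> b, measured from the vertical axis.
function tiltFromVertical(a: Keypoint, b: Keypoint): number {
  return (Math.atan2(Math.abs(b.x - a.x), Math.abs(b.y - a.y)) * 180) / Math.PI;
}

// Map a tilt angle to a 0-100 score: 0 deg tilt = 100, maxTilt or worse = 0.
function score(tiltDeg: number, maxTilt: number): number {
  return Math.max(0, Math.min(100, 100 * (1 - tiltDeg / maxTilt)));
}

function postureScores(kp: Record<string, Keypoint>) {
  // Neck: ear-to-shoulder alignment ("text neck").
  const neck = score(tiltFromVertical(kp.left_shoulder, kp.left_ear), 45);
  // Shoulders: vertical symmetry (leaning) -- compare shoulder heights.
  const dy = Math.abs(kp.left_shoulder.y - kp.right_shoulder.y);
  const dx = Math.abs(kp.left_shoulder.x - kp.right_shoulder.x) || 1;
  const shoulder = score((Math.atan2(dy, dx) * 180) / Math.PI, 20);
  // Spine: nose-to-hip verticality (slouching).
  const spine = score(tiltFromVertical(kp.left_hip, kp.nose), 30);
  return { neck, shoulder, spine };
}
```

A perfectly vertical ear-shoulder segment scores 100 on the neck axis; a head jutting 30 px forward over a 40 px drop tilts ~37° and scores near 18.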

How we built it

We architected a high-performance Hybrid Native App to balance the raw speed of Rust with the reactivity of the Modern Web.

The Core: Built on Tauri (Rust), giving us native OS control and a memory footprint 90% smaller than Electron alternatives.

The AI Engine: We utilized TensorFlow.js (MoveNet Lightning) running on the WebGL backend. This allows us to perform real-time inference directly on the user's GPU, ensuring zero data leaves the device.

The Visuals: We procedurally generated the 3D avatar using Three.js (@react-three/fiber) and Spline, creating a reactive environment that responds to state changes in milliseconds.

The Architect: The entire codebase was developed inside Google's Antigravity IDE, with Gemini 3 Pro acting as our Lead Architect—helping us solve complex vector math and Rust memory safety challenges in real-time.

Challenges we ran into

The "Zombie Camera" Paradox: Even when the app was minimized, the webcam's hardware light stayed on, destroying user trust. We had to engineer a surgical Stream Cleanup Protocol in React's lifecycle, iterating through media tracks to kill hardware access the moment the component unmounts.
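The cleanup step boils down to stopping every track on the MediaStream so the OS releases the camera (and its indicator light). A minimal sketch, factored into a plain helper so it can be shown outside the component; the `TrackLike`/`StreamLike` shapes and `releaseStream` name are ours, mirroring the subset of the MediaStream API involved.

```typescript
// Minimal sketch of the stream-cleanup step. In the app this runs in a
// React useEffect cleanup; the shapes below mirror the subset of
// MediaStream / MediaStreamTrack that matters here.

interface TrackLike { stop(): void; }
interface StreamLike { getTracks(): TrackLike[]; }

// Stop every track so the OS releases the camera hardware immediately.
function releaseStream(stream: StreamLike | null): void {
  if (!stream) return;
  for (const track of stream.getTracks()) {
    track.stop();
  }
}

// Illustrative wiring inside the component:
// useEffect(() => {
//   let stream: MediaStream | null = null;
//   navigator.mediaDevices.getUserMedia({ video: true }).then((s) => { stream = s; });
//   return () => releaseStream(stream); // runs on unmount
// }, []);
```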

The 60FPS State Bottleneck: Running AI inference at the monitor's refresh rate (60Hz) caused React to re-render the UI 60 times a second, freezing the application. We implemented a Throttled State Engine that updates the internal logic at 60Hz but only paints the UI at 10Hz, keeping the experience buttery smooth without losing data precision.
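The 60Hz-logic / 10Hz-paint split can be sketched as a small throttle around the UI setter: every frame updates the internal state, but the paint callback fires at most once per interval. The `ThrottledPainter` class and its names are illustrative; in the app the `paint` callback would be a React state setter.

```typescript
// Sketch of the throttled paint: logic runs every frame, the UI setter
// fires at most once per `intervalMs`. Names are illustrative.

class ThrottledPainter<T> {
  private lastPaint = -Infinity;
  latest: T | null = null; // logic-rate state, always current

  constructor(
    private paint: (value: T) => void,
    private intervalMs: number,
  ) {}

  // Call at 60Hz; paints at roughly 1000 / intervalMs Hz.
  update(value: T, nowMs: number): void {
    this.latest = value; // no data is dropped, only repaints
    if (nowMs - this.lastPaint >= this.intervalMs) {
      this.lastPaint = nowMs;
      this.paint(value);
    }
  }
}
```

With 16.67ms frames and a 100ms interval, roughly one frame in six reaches the UI, so React re-renders ~10 times a second instead of 60.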

Accomplishments that we're proud of

True Stealth Mode: Successfully implementing a detection pipeline where the video element is technically active but visually non-existent (opacity: 0), creating a truly "magical" user experience.

Local-Only Privacy: We achieved 100% on-device processing. We can proudly say that not a single pixel of video data is ever transmitted to a server.

Gemini-Powered Math: Using Gemini 3 Pro to derive the ArcTan2 formulas for our vector scoring system allowed us to achieve clinical-grade angle detection in under 2 hours.

What we learned

Empathy > Data: Users ignore red text warnings, but they react to a sad robot. We learned that anthropomorphizing data is the most powerful way to drive behavioral change.

WebGL Resource Management: We learned the hard way that TensorFlow.js and Three.js fight for the same GPU context. Managing GL context loss and restoration was a masterclass in low-level graphics programming.

What's next for UrBackBuddyAI

We are moving beyond a hackathon prototype to a sustainable, medically-backed health platform.

Enterprise Edition (₹100/mo): We are launching a lightweight subscription for remote teams, offering SQL-based historical analytics and trend reports to help companies reduce RSI-related burnout.

Medical Validation: We are initiating a research study with local Orthopedic Surgeons to calibrate our vector algorithms against clinical standards, aiming to make UrBackBuddyAI a certified tool for physical therapy.

Smart Home Ecosystem: Developing an API to integrate with Philips Hue, turning your physical room lights red when you slouch for too long—creating an immersive, unavoidable feedback loop.

Audio Intervention: A background service that automatically pauses Spotify/YouTube when bad posture is detected, forcing a "micro-break" until alignment is corrected.
