Inspiration

I spend so many hours hunched over a laptop that “sit up straight” becomes white noise. I wanted a coach that lives next to the webcam, gives gentle nudges in real time, and doesn’t require extra hardware.

What it does

Posterize calibrates my neutral head angle in three seconds, then watches for changes with the laptop camera. It streams mirrored video with pose landmarks, flags when I drift, and plots head tilt over time so I can see when fatigue kicks in.
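The writeup doesn't spell out the calibration logic, but the idea can be sketched as: average the head angle over the three-second window to get a baseline, then classify each new reading against a drift threshold. The function names, the 12-degree threshold, and the sign convention here are illustrative guesses, not the project's actual values:

```python
from statistics import mean

def calibrate(angle_samples):
    """Average the head angles captured during the ~3 s calibration
    window to get the neutral baseline."""
    return mean(angle_samples)

def posture_status(angle, baseline, threshold=12.0):
    """Classify the current head angle relative to the baseline.
    The 12-degree threshold and sign convention are illustrative."""
    drift = angle - baseline
    if drift > threshold:
        return "look up"
    if drift < -threshold:
        return "look down"
    return "ok"

baseline = calibrate([2.1, 1.8, 2.4, 2.0])  # samples from the 3 s window
print(posture_status(17.0, baseline))       # drifted well past threshold
```

Logging one classified sample per second, as the backend does, is then just a timestamped append to an in-memory list.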

How we built it

The backend is Flask, OpenCV, and MediaPipe Pose. It captures the webcam, computes the nose angle every frame, and streams MJPEG video annotated with landmarks. I log posture samples once per second in memory and expose them over REST. The frontend is Vite + React with D3 for the trend chart, GSAP for transitions, and a custom WebGL cursor to match the aurora styling.
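The MJPEG streaming piece can be sketched as a generator that wraps each JPEG frame in multipart boundaries. In the real backend the frame bytes would come from OpenCV (e.g. `cv2.imencode`) and the generator would be served by a Flask route; the stand-in frame source below keeps the sketch self-contained:

```python
BOUNDARY = b"frame"

def mjpeg_stream(frame_source):
    """Yield JPEG frames wrapped as multipart/x-mixed-replace chunks.

    frame_source is any iterable of JPEG-encoded byte strings; in the
    real app it would pull annotated frames from the webcam loop.
    """
    for jpeg_bytes in frame_source:
        yield (b"--" + BOUNDARY + b"\r\n"
               b"Content-Type: image/jpeg\r\n"
               b"Content-Length: " + str(len(jpeg_bytes)).encode()
               + b"\r\n\r\n" + jpeg_bytes + b"\r\n")

# In Flask this would be served roughly as:
# return Response(mjpeg_stream(frames),
#                 mimetype="multipart/x-mixed-replace; boundary=frame")

chunk = next(mjpeg_stream([b"\xff\xd8fakejpeg\xff\xd9"]))
print(chunk.startswith(b"--frame"))  # True
```

The browser treats each boundary-delimited part as a fresh image, which is what lets a plain image element display live video with no JavaScript decoding.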

Challenges we ran into

Getting the mirror correct was a saga: flipping the video without flipping the math took a lot of late-night debugging. MJPEG streaming through a plain <img> tag also exposed quirks in browser caching and error handling. Finally, tuning the posture thresholds so that “look up” vs. “look down” felt intuitive required several rounds of iterative testing.
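The mirror fix boils down to flipping the frame only for display while keeping the geometry in camera space, or, equivalently, un-mirroring the x coordinate before any math. Since MediaPipe landmark coordinates are normalized to [0, 1], mirroring is just a reflection around 0.5; the helper name here is mine, not the project's:

```python
def unmirror_x(x_normalized):
    """Map an x coordinate measured on the mirrored frame back to the
    original camera frame. MediaPipe landmark coordinates live in
    [0, 1], so mirroring is a reflection around x = 0.5."""
    return 1.0 - x_normalized

# A landmark at x = 0.25 on the mirrored display is really at 0.75
# in camera space; run the angle math on camera-space coordinates.
print(unmirror_x(0.25))  # 0.75
```

Doing the flip once at render time (e.g. `cv2.flip(frame, 1)` just before JPEG encoding) keeps all upstream landmark math untouched.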

Accomplishments that we're proud of

I’m proud that the app feels polished despite being built in a weekend: the smooth entry animation, the tidy bento-box layout, the responsive chart, and the steady calibration workflow, all without any external wearables or server-as-a-service band-aids.

What we learned

Computer-vision UX lives or dies by latency and tiny interaction details; even small visual feedback delays break trust. I also learned that MediaPipe’s landmark coordinates make geometric math straightforward once you normalize everything, and that commenting code for future demos pays off.
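On the normalized-coordinates point: once every landmark lives in the same [0, 1] space, a head-angle estimate reduces to a single atan2. A minimal sketch, assuming the angle is taken between the nose and the midpoint of the ears; the landmark choice is my assumption, since the writeup only says “nose angle”:

```python
import math

def head_pitch_degrees(nose, left_ear, right_ear):
    """Rough pitch proxy from 2D normalized landmarks: how far the
    nose sits above or below the ear midpoint, scaled by the ear
    distance so it is invariant to how close the face is to the
    camera. Positive = nose above the ears (image y grows downward).
    Each point is an (x, y) pair of normalized coordinates."""
    mid_y = (left_ear[1] + right_ear[1]) / 2
    ear_dist = math.dist(left_ear, right_ear)
    return math.degrees(math.atan2(mid_y - nose[1], ear_dist))

# Face looking straight ahead: nose level with the ears.
print(head_pitch_degrees((0.5, 0.5), (0.35, 0.5), (0.65, 0.5)))  # 0.0
```

Scaling by ear distance rather than a fixed pixel count is what makes the normalization payoff concrete: the same thresholds work whether I lean toward or away from the camera.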

What's next for Posterize

I’d like to detect full-body posture, add stretch reminders, and push summary reports at the end of the day.
