Inspiration

Our inspiration was our own terrible posture: most people in university, ourselves included, sit badly, and over time you start to look like a shrimp and develop health complications. We built this app to help ourselves and our fellow peers fix that.

What it does

Our project takes input from your webcam and checks your sitting posture in real time while you're typing or coding. It detects when your posture deviates from an ideal, upright position, catching slouching, forward head tilt, and rounded shoulders, and gives on-screen color feedback that shifts from red (poor posture) to white (excellent posture).
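The red-to-white feedback can be thought of as a linear blend over a posture score. Here is a minimal sketch; the 0–100 score range and the `posture_color` helper are illustrative assumptions, not the exact code:

```python
def posture_color(score: float) -> tuple[int, int, int]:
    """Map a posture score in [0, 100] to a BGR color for OpenCV.

    0 -> pure red (poor posture), 100 -> white (excellent posture).
    The red channel stays at 255; blue and green scale with the score,
    so the color fades from red toward white as posture improves.
    """
    t = max(0.0, min(score, 100.0)) / 100.0
    return (int(255 * t), int(255 * t), 255)  # (B, G, R) order, as OpenCV expects
```

A color like this can be passed straight into OpenCV drawing calls such as `cv2.putText` or `cv2.rectangle` for the overlay.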

How we built it

We built No More Shrimp in Python, combining MediaPipe Pose, which detects 33 body landmarks with high accuracy even during subtle movements, with OpenCV, which handles webcam input, draws the posture-skeleton overlay, and renders live feedback. At its core is a custom adaptive analysis engine that calculates the angles between the neck, shoulders, hips, and head, using strict thresholds for better sensitivity.

The app includes a dynamic Heads-Up Display (HUD) that shows posture scores, real-time metrics, and actionable recommendations, and it supports multiple camera angles (front, side, and back) with adaptive scoring for each. We also added a session tracker that logs posture scores and alerts throughout a user's work session and saves everything as JSON files with metadata, making it easy to load past sessions and track improvement over time.

Along the way we learned a lot about MediaPipe and OpenCV: how to extract and use landmark data effectively, how to design a UI that is clear but detailed, and how to run real-time analysis without slowing things down. We also gained experience designing adaptive feedback and storing structured session data as JSON in a way that is easy to load and review.
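The angle calculations at the engine's core boil down to measuring the angle at a joint given three landmark positions, which can be done with a standard dot-product formula. This is a minimal sketch; the `joint_angle` name and the ear/shoulder/hip example are our illustration, not the exact implementation:

```python
import math

def joint_angle(a, b, c):
    """Return the angle in degrees at vertex b, formed by points a-b-c.

    Each point is an (x, y) pair, e.g. the normalized coordinates that
    MediaPipe Pose reports for each landmark.
    """
    v1 = (a[0] - b[0], a[1] - b[1])  # vector b -> a
    v2 = (c[0] - b[0], c[1] - b[1])  # vector b -> c
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

# Hypothetical usage: a neck angle from ear, shoulder, and hip landmarks.
# neck_angle = joint_angle(ear_xy, shoulder_xy, hip_xy)
```

An upright back yields an angle near 180 degrees at the shoulder, so slouching shows up as the angle dropping below a chosen threshold.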

Challenges we ran into

When we were building No More Shrimp, we ran into a few challenges along the way. Getting the angle calculations right was tricky: measuring the neck, shoulders, and hips in a way that worked across different people and camera setups took some trial and error. We also noticed that certain views, like the side view, were harder to assess consistently, so we focused on making the front-facing experience as accurate as possible. Learning to use MediaPipe and OpenCV effectively was another big part of the process, especially working with the landmark data and making sure the visuals updated smoothly in real time. Finally, designing the Heads-Up Display (HUD) took some experimentation; we wanted it to show clear, useful feedback without cluttering the screen or overwhelming the user.

Accomplishments that we're proud of

We’re proud of how our project turned out, especially how detailed and responsive it feels. We put a lot of effort into getting the landmark and angle calculations right, with a focus on accurately tracking the movement of the back and shoulders. Along the way, we learned to work with new libraries like MediaPipe and OpenCV and to use them effectively for the front end, real-time analysis, and live visual feedback. We also made sure the system provides continuous posture monitoring, records useful session data, and presents it through a clear, adaptive UI that adjusts to different camera angles and delivers comprehensive, actionable feedback in an easy-to-understand way.

What we learned

While building No More Shrimp, we learned a lot about working with MediaPipe and OpenCV: how to pull landmark data, calculate meaningful angles, and keep everything running smoothly in real time. We also got better at turning raw data into feedback people can actually understand and at designing a UI that’s clear and easy to use without being overwhelming. We picked up good lessons about making the system adapt to different camera angles and user movements, and we learned how to save and load session state as JSON with helpful metadata, so users can see their posture history over time.
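Saving session state as JSON with metadata is straightforward with the standard library. Here is a minimal sketch under assumptions of ours: the `save_session`/`load_session` names and the record layout are hypothetical, chosen for illustration rather than taken from the actual code:

```python
import json
import time

def save_session(scores, alerts, path="session.json"):
    """Persist a work session's posture scores and alerts, plus metadata."""
    record = {
        "metadata": {
            "saved_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "num_samples": len(scores),
        },
        "scores": scores,   # e.g. one posture score per analyzed frame
        "alerts": alerts,   # e.g. messages like "slouch detected"
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2)

def load_session(path="session.json"):
    """Load a previously saved session record for review."""
    with open(path) as f:
        return json.load(f)
```

Because the record is plain JSON, past sessions can be reloaded and compared to show improvement over time.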

What's next for No More Shrimp

For No More Shrimp, we see plenty of room to grow. We want to polish the UI and UX to make the app even more user-friendly and visually appealing. One of our next goals is to revisit how we handle side posture and find a more reliable way to detect and analyze it. We also want to improve how the system tracks progress over time, helping users see their improvement and giving them more meaningful feedback. Another idea is to turn it into a standalone application so it’s easier to install and use outside of a development environment. Finally, we’d love to add a feature that gives users practical tips on improving their posture based on their specific habits and patterns.

Built With

Python, MediaPipe Pose, OpenCV