Inspiration

We were inspired by our teammate Caden's irredeemable posture and figured this would be a fun way to correct the posture of developers everywhere.

What it does

Our project uses computer vision and ML/AI models to identify if a subject is hunched over their workspace in an unergonomic way. A live posture score and video feed of the subject are compiled and sent to a website for viewing. Additionally, if the posture is deemed bad enough, a motor spins a pool noodle to strike the subject and remind them to sit up straight.

How we built it

This was built in three key components.

First is the Python script, which uses OpenCV (an open-source computer vision library) and MediaPipe (a Google ML/AI set of solutions for many computer vision applications) to identify the pose of the subject and calculate the angle between their left shoulder and their left ear to determine if they are in a slouched position. Once this determination has been made, a score, a video feed, and an on/off signal for the motor are broadcast in response to the user's posture.
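The shoulder-to-ear angle check can be sketched in pure Python. MediaPipe Pose reports normalized (x, y) landmark coordinates, which is all the math below needs; the 40° cutoff is an illustrative assumption, not our tuned value.

```python
import math

SLOUCH_THRESHOLD_DEG = 40.0  # assumed cutoff, not our actual tuned value

def neck_angle_deg(shoulder, ear):
    """Angle between the shoulder->ear line and vertical, in degrees.

    Image coordinates grow downward, so the ear normally sits at a
    smaller y than the shoulder.
    """
    dx = ear[0] - shoulder[0]
    dy = shoulder[1] - ear[1]  # positive when the ear is above the shoulder
    return math.degrees(math.atan2(abs(dx), dy))

def is_slouched(shoulder, ear, threshold=SLOUCH_THRESHOLD_DEG):
    return neck_angle_deg(shoulder, ear) > threshold

# Ear directly above the shoulder: upright.
print(is_slouched((0.5, 0.6), (0.5, 0.4)))   # False
# Ear far forward of the shoulder: slouched.
print(is_slouched((0.5, 0.6), (0.7, 0.55)))  # True
```

In the real pipeline, `shoulder` and `ear` would come from the `LEFT_SHOULDER` and `LEFT_EAR` entries of MediaPipe's pose landmark list for each frame.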

Next is the frontend. The frontend is built with React and exists to hold goodies for the viewer's enjoyment, like a live video feed of the subject and a score rating the subject's posture, both of which are received from the Python backend.
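The writeup doesn't specify the transport between the Python backend and the React page; one minimal sketch, assuming the frontend polls a JSON endpoint, uses only the standard library:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Shared state the vision loop would update each frame (names illustrative).
latest = {"score": 100, "slouched": False}

class ScoreHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/score":
            body = json.dumps(latest).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            # Allow a React dev server on another port to poll us.
            self.send_header("Access-Control-Allow-Origin", "*")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port=0):
    """Start the server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), ScoreHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

On the React side this would just be a `fetch("http://localhost:<port>/score")` on an interval; the live video feed would need a streaming endpoint (e.g. MJPEG) rather than polling.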

Last is the Arduino/robotics aspect, which is made up of an Arduino Uno connected to a motor and a pool noodle. It receives either an on or off signal depending on the subject's posture and spins up to correct bad posture as needed.
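The on/off decision on the Python side can be sketched as below. The single-byte protocol and the hysteresis band are assumptions (the writeup only says an on/off signal is sent); pySerial is imported lazily so the logic runs without hardware attached.

```python
MOTOR_ON, MOTOR_OFF = b"1", b"0"

def motor_command(score, state, on_below=40, off_above=60):
    """Return the byte to send, with hysteresis so the motor doesn't
    chatter on and off when the score hovers near a single threshold."""
    if state == MOTOR_OFF and score < on_below:
        return MOTOR_ON
    if state == MOTOR_ON and score > off_above:
        return MOTOR_OFF
    return state

def main():
    import serial  # pySerial; the port name is machine-specific
    ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
    state = MOTOR_OFF
    for score in posture_scores():  # hypothetical generator from the vision loop
        state = motor_command(score, state)
        ser.write(state)  # the Arduino reads one byte and spins the noodle
```

On the Arduino side, the matching sketch would `Serial.read()` one byte per loop and drive the motor pin accordingly.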

Challenges we ran into

Our original idea was to attach a string to a helmet and tug on the subject to remind them that they were slouching, but after experimenting with all of the motors we had, plus some gear ratios, we were unable to produce a system with enough power to pull the subject's head back. This forced us to pivot to a pool-noodle slapping technique at the last minute, which was quite a challenge to pull off.

Connecting our Python script to the Arduino was quite difficult, as none of us had done any robotics recently, so there was quite a bit of pain syncing up a Python script and an Arduino, since they have very different ways of timing the code they run. This led to lots of confusion as we tried many different ways of relaying the posture info to the Arduino in order to trigger the mechanical reminder we set up.

Accomplishments that we're proud of

We are extremely proud of our efficiency at this hackathon. We completed basically everything we wanted without a ridiculous amount of crunch or sleep deprivation! We have participated in events like this before and have not been able to complete them as comfortably as we did this one.

What we learned

We learned how to do computer vision, as none of us had ever done it before. We also learned how to integrate Python with both a frontend and robotics at the same time, which is really cool.

What's next for Evil Posture Corrector

Our plan for the future is to add the original posture-yanker design once we get our hands on some more powerful motors.
