Inspiration
We saw that most workout trackers lack real-time feedback and adaptive guidance, leaving people with poor form or stalled progress. We wanted a "personal" trainer that's available 24/7: no more excuses about timing or needing someone else for posture checks.
What it does
- AI-Powered Posture Detection: Live video analysis (MediaPipe + OpenCV) flags form errors and counts reps in real time.
- Personalized Journaling: Logs weights, reps, and performance history.
- Intelligent Recommendations: Uses your past workout data to suggest next-session exercises and weight scaling.
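The rep counting above boils down to tracking a joint angle across frames. Here is a minimal sketch, assuming MediaPipe Pose supplies (x, y) coordinates for the shoulder, elbow, and wrist landmarks each frame; the angle thresholds are illustrative defaults, not our actual tuning:

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

class RepCounter:
    """Counts one rep each time the elbow goes from extended to flexed and back."""

    def __init__(self, flexed=60.0, extended=150.0):
        self.flexed = flexed      # below this angle the arm counts as curled
        self.extended = extended  # above this angle the arm counts as extended
        self.stage = "down"
        self.reps = 0

    def update(self, shoulder, elbow, wrist):
        angle = joint_angle(shoulder, elbow, wrist)
        if angle < self.flexed and self.stage == "down":
            self.stage = "up"
        elif angle > self.extended and self.stage == "up":
            self.stage = "down"
            self.reps += 1
        return self.reps
```

Feeding `update()` the three landmark coordinates once per frame yields a running rep count; the same state-machine shape generalizes to other exercises by swapping which joints define the angle.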
How we built it
- Backend: FastAPI with MongoDB (Motor) for asynchronous session tracking and journals.
- Frontend: React + Vite serving a single-page app that connects to our API.
- Computer Vision: TensorFlow Lite's XNNPACK delegate + MediaPipe Pose for low-latency form analysis.
Challenges we ran into
- Tuning the CV pipeline for smooth, lag-free feedback at 30 FPS.
- Managing camera sessions and cleanup so streams don't throw the "session already in progress" error.
- Designing a predictive journaling algorithm that feels both safe and motivating.
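The "session already in progress" problem comes down to guarding a single camera resource and always releasing it. A minimal sketch of that pattern, using a lock plus a context manager; the `CameraManager` name and the `cv2.VideoCapture` wiring are illustrative, and our actual cleanup logic differs in detail:

```python
import threading
from contextlib import contextmanager

class CameraBusyError(RuntimeError):
    """Raised when a second client tries to open the camera mid-session."""

class CameraManager:
    """Allows at most one live capture session and always releases the device."""

    def __init__(self, open_capture):
        self._open_capture = open_capture  # e.g. lambda: cv2.VideoCapture(0)
        self._lock = threading.Lock()

    @contextmanager
    def session(self):
        if not self._lock.acquire(blocking=False):
            raise CameraBusyError("session already in progress")
        cap = self._open_capture()
        try:
            yield cap
        finally:
            # release() runs even if the stream handler raises mid-frame,
            # so the next session can reacquire the camera.
            cap.release()
            self._lock.release()
```

Wrapping every stream handler in `with manager.session() as cap:` means a crashed client can no longer leave the device locked.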
Accomplishments that we're proud of
- Achieved stable, sub-100 ms pose inference on a standard laptop webcam.
- Built an end-to-end flow: from video capture through AI inference to database storage and UI.
- Delivered a polished demo in under 24 hours during our hackathon.
What we learned
- The trade-offs between model accuracy and real-time performance in browser/Node environments.
- Best practices for writing async CRUD endpoints in FastAPI and handling long-lived camera I/O.
- How to structure a simple React/Vite project for rapid prototyping.
What's next for FitForm.AI
- Expand the exercise library beyond bicep curls (e.g., squats, planks, lunges).
- Mobile app version for on-the-go feedback.
- Advanced AI coaches: integrate more sophisticated ML models that adapt messaging based on user progress.
- Scalability improvements: auto-scale the backend and add social/community features.