Inspiration
As dedicated MMA enthusiasts and product people (Barry, a Product Designer in Auckland, and Jamie, a Product Manager in Brisbane), we were obsessed with the process of improvement. By day, we build products that improve lives; by night, we apply that same passion to our training. Yet we kept hitting the same wall. We had fitness data from our wearables, but nothing could answer the most important question: "Is my technique actually getting any better?"
We saw the incredible, data-driven analysis happening at elite institutions like the UFC Performance Institute and felt the massive gap between the pros and the rest of us. Our inspiration was born from that frustration: we wanted to use our skills in product development to build a bridge across that gap, making elite-level sports science accessible to any athlete with a smartphone.
What it does
Pivot Labs Fit is a digital performance institute designed to be every martial artist's personal biomechanics coach. For this hackathon, we built a functional prototype of our core feature: the AI Biomechanics Analyst.
Our web application allows a user to upload a video of a specific movement (like a squat or a jab). The app then uses a computer vision model to analyse their technique, providing objective, data-driven feedback on key performance indicators. Instead of guesswork, the athlete gets a clear breakdown of their form, with metrics such as joint angles, movement path, and stability highlighting where to improve. This prototype is the first step towards our larger vision of a complete performance platform that tracks an athlete's skills, conditioning, and recovery.
How we built it
We built Pivot Labs Fit as a full-stack web application, leveraging AI-powered tools to accelerate our development across the Tasman Sea.
- AI-Assisted Development: We used an AI development agent, bolt.new, to rapidly scaffold the entire project. This let us generate the backend, frontend, and database connections from natural-language prompts, saving days of setup and freeing us to focus on the core logic.
- Frontend: The dashboard and user interface were built with React and styled with Tailwind CSS for a clean, responsive, modern feel. Data visualisations were created with Chart.js (a charting sketch follows this list).
- Backend: A simple Node.js server using the Express framework handles video uploads and analysis requests (see the upload-handler sketch below).
- AI & Computer Vision: The core of our analysis engine uses a pre-trained pose estimation model (TensorFlow.js with MoveNet) to extract key joint coordinates from the uploaded video in real-time. We then wrote custom logic to process these coordinates into meaningful biomechanical KPIs.
- Deployment: The entire application is deployed on Vercel, providing a live, shareable URL for our prototype.
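To show how these pieces fit together, here is a minimal sketch of the upload endpoint, assuming multer for multipart handling; the route path, field name, and port are illustrative, not our exact configuration:

```ts
import express from 'express';
import multer from 'multer';

const app = express();
// Buffer uploaded clips to disk before the analysis step picks them up.
const upload = multer({ dest: 'uploads/' });

// Hypothetical route: accepts a single video file under the "video" field.
app.post('/api/analyse', upload.single('video'), (req, res) => {
  if (!req.file) {
    res.status(400).json({ error: 'No video uploaded' });
    return;
  }
  // In the real app the analysis request is queued here; we return an id to poll.
  res.json({ fileId: req.file.filename });
});

app.listen(3000, () => console.log('Pivot Labs Fit API listening on :3000'));
```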
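Next, the pose-extraction step, sketched against the public @tensorflow-models/pose-detection API; the helper names are ours, and running inference on a playing video element is one of several supported inputs:

```ts
import '@tensorflow/tfjs-backend-webgl'; // registers the GPU backend for the browser
import * as poseDetection from '@tensorflow-models/pose-detection';

// Create a MoveNet detector once and reuse it for every frame.
async function createMoveNetDetector(): Promise<poseDetection.PoseDetector> {
  return poseDetection.createDetector(poseDetection.SupportedModels.MoveNet, {
    modelType: poseDetection.movenet.modelType.SINGLEPOSE_LIGHTNING,
  });
}

// Extract named keypoints (nose, shoulders, hips, knees, ...) from the current frame.
async function keypointsForFrame(
  detector: poseDetection.PoseDetector,
  video: HTMLVideoElement,
): Promise<poseDetection.Keypoint[]> {
  const poses = await detector.estimatePoses(video);
  return poses.length > 0 ? poses[0].keypoints : [];
}
```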
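Finally, a minimal Chart.js sketch of plotting one KPI across the clip on the dashboard; the FrameKpi shape and renderKneeAngleChart helper are hypothetical stand-ins for our real dashboard code:

```ts
import Chart from 'chart.js/auto';

// Hypothetical per-frame output of the analysis step.
interface FrameKpi {
  time: number;      // seconds into the clip
  kneeAngle: number; // degrees
}

// Draw the knee-angle trace onto a canvas in the results view.
function renderKneeAngleChart(canvas: HTMLCanvasElement, frames: FrameKpi[]): Chart {
  return new Chart(canvas, {
    type: 'line',
    data: {
      labels: frames.map((f) => f.time.toFixed(1)),
      datasets: [{ label: 'Knee angle (°)', data: frames.map((f) => f.kneeAngle) }],
    },
    // A joint angle is bounded by 0–180°, so pin the y-axis for comparability.
    options: { scales: { y: { min: 0, max: 180 } } },
  });
}
```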
Challenges we ran into
Building a complex application remotely across two countries, while juggling full-time jobs and family, was a significant challenge.
- The Human Element: The biggest challenge was the personal grind. For Jamie, a father of two young kids, development time was a precious resource, often found late at night after the family was asleep. Coordinating across the Tasman required rigid discipline and constant communication to stay in sync.
- Technical Complexity: Translating raw output from the pose estimation model (a stream of x,y coordinates) into scientifically valid, easy-to-understand feedback was a major hurdle. We spent a significant amount of time researching biomechanics to define the right KPIs and build the logic to calculate them accurately; a sketch of one such calculation follows this list.
- Defining "Correct" Technique: Without access to a multi-million dollar sports science lab, we had to be pragmatic. We defined the "gold standard" for our initial movement analysis by using a combination of academic research, expert coaching tutorials, and footage of elite athletes.
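To make the KPI logic concrete, here is a minimal sketch of one such calculation, assuming MoveNet-style 2D keypoints: the interior angle at a joint, computed from three coordinates.

```ts
interface Point2D {
  x: number;
  y: number;
}

// Interior angle at joint b formed by the segments b→a and b→c, in degrees.
// For a knee angle, pass a = hip, b = knee, c = ankle.
function jointAngle(a: Point2D, b: Point2D, c: Point2D): number {
  const ab = { x: a.x - b.x, y: a.y - b.y };
  const cb = { x: c.x - b.x, y: c.y - b.y };
  const dot = ab.x * cb.x + ab.y * cb.y;   // |ab||cb|·cos θ
  const cross = ab.x * cb.y - ab.y * cb.x; // |ab||cb|·sin θ
  return (Math.abs(Math.atan2(cross, dot)) * 180) / Math.PI; // result in [0, 180]
}

// Example: hip directly above the knee, ankle out to the side → a 90° bend.
// jointAngle({ x: 0, y: 0 }, { x: 0, y: 1 }, { x: 1, y: 1 }) === 90
```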
Accomplishments that we're proud of
Despite the challenges, we're incredibly proud of what we've built in such a short time.
- A Functional End-to-End Prototype: We didn't just create a slideshow; we built a working, deployed application that proves the core concept is viable.
- Integrating Complex AI into a Simple UI: We successfully took a complex piece of AI technology and wrapped it in a clean, intuitive user interface that provides real, tangible value.
- Validating Our Core Idea: The prototype demonstrates that it's possible to get objective biomechanical feedback using just a smartphone, which validates our core idea.
Built With
- bolt.new
- framer
- react
- supabase
