Inspiration

We wanted to make a tool that learned from our natural movement in order to create a tailored, more engaging workout experience for the user.

What it does

Our app makes your workout more engaging and supportive: a custom ML model tracks your body's movement and automatically records your reps and workout progress.

How we built it

We first recorded motion data of ourselves doing body-weight exercises (push-ups, squats, and jumping jacks) using the accelerometer and gyroscope inside an iPhone and a pair of AirPods Pro. We then used scikit-learn to train an ML model that recognizes the exercise from raw movement data streamed to an iPhone fitness-assistant app. Finally, we used coremltools to convert the trained model to a Core ML model compatible with the Swift app.
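The train-then-classify step can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the team's actual code: the window length, the three summary features, and the nearest-centroid rule (standing in for whatever scikit-learn model was actually trained) are all assumptions. In the real pipeline, a trained scikit-learn model would then be converted with coremltools' scikit-learn converter for use from Swift.

```python
import statistics

WINDOW = 50  # hypothetical window length in samples (~1 s at 50 Hz)

def features(window):
    """Summarize one window of motion-signal magnitudes: mean, spread, range."""
    return (
        statistics.fmean(window),
        statistics.pstdev(window),
        max(window) - min(window),
    )

def train_centroids(labeled_windows):
    """Average the feature vectors per exercise label (a stand-in for a real
    scikit-learn classifier trained on the same features)."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def classify(window, centroids):
    """Predict the exercise whose feature centroid is nearest in feature space."""
    f = features(window)
    return min(
        centroids,
        key=lambda lab: sum((a - b) ** 2 for a, b in zip(f, centroids[lab])),
    )
```

In use, each incoming window of sensor samples is reduced to features and matched against the per-exercise centroids learned from the recorded sessions.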

Challenges we ran into

When counting reps per exercise, we had trouble identifying the start and end of each rep, but we were able to refine our detection through threshold experimentation and trial and error.
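A simple version of the threshold approach described above might look like this. The two-threshold (hysteresis) scheme and the specific threshold values are illustrative assumptions, not the tuned values from the project:

```python
def count_reps(signal, high=1.5, low=0.8):
    """Count reps in a 1-D motion signal using two thresholds (hysteresis):
    a rep 'starts' when the signal rises above `high` and is counted once it
    falls back below `low`. Using two thresholds instead of one suppresses
    double-counting from jitter around a single crossing point."""
    reps = 0
    in_rep = False
    for sample in signal:
        if not in_rep and sample > high:
            in_rep = True   # rep started: crossed the upper threshold
        elif in_rep and sample < low:
            in_rep = False  # rep completed: dropped below the lower threshold
            reps += 1
    return reps
```

The gap between `high` and `low` is exactly the kind of parameter that would be found by the threshold experimentation mentioned above.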

Accomplishments that we're proud of

We were pleased that we were able to combine our love for fitness with the challenge theme of collecting non-textual data for AI and machine learning.

What we learned

While we knew the iPhone would have rich sensing capabilities, we did not expect the AirPods Pro to have both a gyroscope and an accelerometer. These sensors turned out to be the key to this hack: they gave us two independent sources of movement data, which let us identify exercises with a high degree of accuracy.

What's next for Movement?

We think combining off-the-shelf consumer hardware like the iPhone and AirPods Pro could be a cost-effective way to capture data for a variety of purposes beyond fitness, such as remote health diagnostics, physical therapy, and more.
