Inspiration

We wanted to build a cool, interactive wearable device that could help with fitness and turn boxing practice into something we could play and compete with friends. We knew that by leveraging physical sensors and Machine Learning, we could go far beyond basic step-counters or motion detectors to actually classify complex strikes. While we focused on perfecting the "Jab" for this iteration due to hackathon time constraints, our ultimate vision was to create a smart, gamified boxing assistant.

What it does

Our project is a real-time, wearable AI punch tracker. The physical device is strapped to the user's dominant arm, where it constantly monitors the physics of their movement. Using a custom-trained Machine Learning model, it accurately detects the exact moment a user throws a "Jab" versus when they are just resting ("Idle"). When a jab is thrown, the AI also calculates the raw power of the punch and instantly beams that data to a sleek desktop interface that displays a dynamic power meter.

How we built it

The hardware prototype consists of an Arduino Nano ESP32 wired to an IMU (gyroscope and accelerometer) and a battery pack, secured to the wrist. The microcontroller constantly reads the 6-axis motion data and streams it over Bluetooth Low Energy (BLE).
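On the laptop side, each incoming BLE notification has to be decoded back into six axis readings before anything else can happen. A minimal sketch of that decoding step; the packet layout (six little-endian int16 values) and the scale factor are illustrative assumptions, not our exact firmware format:

```python
import struct

# Hypothetical packet layout: six little-endian int16 values
# (ax, ay, az, gx, gy, gz) per BLE notification.
IMU_FMT = "<6h"
ACCEL_SCALE = 9.81 / 4096  # assumed scale: raw LSB -> m/s^2


def parse_imu_packet(payload: bytes):
    """Decode one 12-byte notification into accel (m/s^2) and gyro (raw) tuples."""
    ax, ay, az, gx, gy, gz = struct.unpack(IMU_FMT, payload)
    return (ax * ACCEL_SCALE, ay * ACCEL_SCALE, az * ACCEL_SCALE), (gx, gy, gz)
```

A callback registered on the BLE characteristic would call this and push the result onto the shared queue.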

On the software side, a Python script running on a laptop listens to the live Bluetooth feed using a threaded queue architecture. We built a custom sliding-window algorithm that acts as a 1.5-second "memory buffer": it extracts the user's most recent hand-motion data and feeds it into a pre-trained Random Forest model. Finally, a custom Tkinter GUI catches the model's predictions and visualizes the punches in real time.

Challenges we ran into

Bridging the gap between hardware and AI is incredibly difficult. We ran into several RF and Bluetooth connectivity issues and had to debug how to maintain a stable, high-speed data stream without the OS dropping packets.

However, our biggest hurdle was Machine Learning data collection. We initially tried to use a pre-existing dataset, only to realize the data was flawed and synthetic! We had to pivot completely, build our own data-collection pipeline, and physically throw punches ourselves to generate and label our own dataset. We learned the hard way that AI is only as smart as the data you feed it, and balancing the dataset to keep the model from overfitting was a massive challenge.
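The replacement pipeline essentially boiled down to appending labeled sensor rows to a file while one of us threw punches. A minimal sketch; the file name and column order here are illustrative, not our actual schema:

```python
import csv
import time


def record_session(samples, label: str, path: str = "dataset.csv") -> None:
    """Append labeled 6-axis samples to a CSV file.

    Each row: timestamp, ax, ay, az, gx, gy, gz, label ("Jab" or "Idle").
    """
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for ax, ay, az, gx, gy, gz in samples:
            writer.writerow([time.time(), ax, ay, az, gx, gy, gz, label])
```

Keeping the label in the same row as the raw samples made it easy to rebalance classes later by simply counting rows per label.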

Accomplishments that we're proud of

We are incredibly proud of building a custom Machine Learning model entirely from scratch. Going from a chaotic stream of numbers to an algorithm that successfully uses gyroscope data to understand the complex pattern of human hand motion was a huge win. Building the entire end-to-end pipeline, from soldering the physical hardware, to Bluetooth streaming, to signal processing, to training the AI, and finally building the UI, felt like a massive technical achievement.

What we learned

Our biggest "Aha!" moment was learning how to tame real-world physics. We learned firsthand how chaotic and noisy raw accelerometer data can be. We had to write signal-processing scripts that used peak detection (looking for massive 15 m/s^2 acceleration spikes) to slice through the noise and extract clean data windows for our AI to learn from.
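That slicing step can be sketched with SciPy's `find_peaks`, assuming a 50 Hz sample rate; our actual script may have differed in detail:

```python
import numpy as np
from scipy.signal import find_peaks

PUNCH_THRESHOLD = 15.0               # m/s^2, the spike level we looked for
SAMPLE_HZ = 50                       # assumed IMU sample rate
HALF_WINDOW = int(0.75 * SAMPLE_HZ)  # half of a 1.5 s window centered on a peak


def slice_punch_windows(accel_mag: np.ndarray) -> list[np.ndarray]:
    """Find acceleration spikes above the threshold and cut clean windows around them."""
    # distance=SAMPLE_HZ keeps peaks at least one second apart,
    # so one punch doesn't produce several overlapping windows.
    peaks, _ = find_peaks(accel_mag, height=PUNCH_THRESHOLD, distance=SAMPLE_HZ)
    windows = []
    for p in peaks:
        lo, hi = p - HALF_WINDOW, p + HALF_WINDOW
        if lo >= 0 and hi <= len(accel_mag):  # skip peaks too close to the edges
            windows.append(accel_mag[lo:hi])
    return windows
```

Each returned window is a fixed-length slice centered on a spike, which is exactly the shape the feature extractor expects.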

Next steps for the project

Now that the core hardware and AI pipeline is completely functional, the next step is scale. With more time to physically generate and label diverse data, we plan to expand the AI's "brain" to detect a full repertoire of strikes, including hooks and uppercuts. Ultimately, we want to upgrade the software to act as a digital coach, using ML to give users post-session feedback on their speed, stamina, and form over time.
