Inspiration

The inspiration for this project came from a shared frustration with how movement and injury risk are currently evaluated. Most biomechanical insights come from lab-based tools like force plates and motion capture, which are expensive, stationary, and disconnected from how people actually move in the real world. As athletes, we wanted a way to understand how force is applied through the foot during real training, fatigue, and competition, not just in controlled settings. Seeing research that showed foot pressure could reveal gait mechanics and injury risk made us realize that this data was both powerful and largely inaccessible. Our goal became to bring lab-quality force insights into a wearable form factor, turning raw pressure data into actionable, real-time feedback that helps people move better, train smarter, and avoid injury.

What it does

Kinetra is an injury-prevention and performance intelligence hardware and software platform built on high-resolution plantar pressure and force data. Instead of simply displaying raw pressure heat maps or basic activity metrics, Kinetra analyzes how mechanical load accumulates across the foot over time, detecting asymmetries, fatigue-driven breakdowns, and overload patterns that are strongly correlated with common overuse and impact-related injuries. This allows users to understand not just what movements they are performing, but how their bodies are responding to sustained physical stress.

Sometimes foot pressure data alone isn't enough. To create a truly immersive coaching experience, we integrated overshoot.ai. When recording an episode, users have the option of starting an AI video stream. This stream collects information on how the exercise is being performed, capturing details that foot pressure alone can't. In any later analysis, our AI coach can then use this information to provide user-specific guidance.

Kinetra’s AI Coach delivers this insight in real time and post-session. During activity, the system can proactively alert users when fatigue or asymmetry emerges, signaling when it may be time to slow down, adjust technique, or recover. After workouts, the AI Coach reviews sessions and provides clear, actionable feedback on load management, movement efficiency, and injury risk, helping users make informed training decisions rather than relying on intuition alone.

Kinetra is designed for a broad athletic population, from beginners learning proper form, to competitive and professional athletes optimizing performance, to older active individuals who face elevated injury risk due to fatigue and recovery limitations. By translating complex biomechanical data into practical guidance, Kinetra empowers anyone who moves with intent to train smarter, stay healthier, and reduce the risk of preventable injuries.

How we built it

Hardware: Kinetra's sensing system is built around an 85-node piezoresistive pressure matrix (13 rows × 9 columns) embedded within a 3D-printed active-foaming TPU insole for a men's size 9 shoe. The matrix consists of orthogonal conductive thread traces separated by a Velostat layer, a pressure-sensitive polymer whose resistance decreases under load. These components are embedded during printing via custom G-code at specific layers, creating a completely passive, flexible sensing layer with no rigid electronics within the footbed itself. Internal structures prevent thread-to-thread shorting while distributing heat and pressure evenly across the matrix.

The system is powered by an Adafruit Feather Sense nRF52840 microcontroller (featuring integrated BLE and 9-axis IMU) housed in a rigid heel-mounted clip alongside a 16-channel analog multiplexer (CD74HC4067), a pull-down resistor array, and a Li-Po battery. The sensing logic operates on voltage-divider principles: each row is sequentially driven high (3.3 V) while the multiplexer reads the corresponding column voltages through the MCU's ADC, allowing a complete matrix traversal in milliseconds. Pre-loading the conductive threads proved critical: testing confirmed that adequate compressive force stabilizes baseline resistance and maintains sensitivity, with pressure-induced resistance ranging from roughly 20 Ω (heavy load) to 15-20 kΩ (light contact), well within the system's detection range when paired with appropriately sized reference resistors.
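The row-drive/column-read logic reduces to a voltage divider at each node, which can be simulated without hardware. A minimal sketch (the 10 kΩ pull-down value and 12-bit ADC range are illustrative assumptions, not our exact firmware constants):

```python
def node_adc(r_sensor: float, r_ref: float = 10_000, adc_max: int = 4095) -> int:
    """ADC count at a column pull-down while its row is driven to 3.3 V.

    The Velostat node (r_sensor) and the pull-down (r_ref) form a voltage
    divider: V_col = Vcc * r_ref / (r_sensor + r_ref), so the count is
    independent of Vcc once normalized to full scale.
    """
    return round(adc_max * r_ref / (r_sensor + r_ref))


def adc_to_resistance(adc: int, r_ref: float = 10_000, adc_max: int = 4095) -> float:
    """Invert the divider to recover node resistance from an ADC count."""
    adc = max(adc, 1)  # avoid division by zero on an open (no-contact) node
    return r_ref * (adc_max - adc) / adc


def scan_matrix(resistances):
    """Simulate one full traversal: drive each row high in turn and read
    every column through the multiplexer."""
    return [[node_adc(r) for r in row] for row in resistances]
```

With a 10 kΩ pull-down, a 20 Ω heavy-load node reads near full scale while a 20 kΩ light-contact node reads about a third of full scale, so the two extremes quoted above are easily separable in the ADC domain.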

The system was designed with the intent that it could be fully manufactured from scratch within a 4.5-hour window.

Software:

The software stack was designed for real-time data flow, multimodal analysis, and fast iteration during a hackathon. A React frontend handles live visualization, session control, and user interaction. Sensor data streams from the device via BLE to the browser and is forwarded over WebSockets to a Flask backend, which serves as the central orchestration layer. The backend computes biomechanical statistics, runs PyTorch models for inference, and manages session state. All workout “episodes” and derived metrics are stored in MongoDB for replay and post-session analysis. For richer context beyond pressure data, we integrated overshoot.ai to perform video-language-model inference during recorded sessions. Finally, the AI Coach is powered by LiveKit, enabling agent-based reasoning, workout awareness, and real-time text-to-speech feedback.

On the modeling side, we built two core ML pipelines. The first is an exercise classification model that operates directly on plantar-pressure time series: pressure frames are encoded spatially with a CNN, modeled temporally with a GRU, and passed through an MLP to classify the current exercise in real time, allowing the system to detect movement type automatically without manual input. The second is a 3D lower-body reconstruction model that predicts leg joint coordinates in 3D space using only shoe-mounted sensor data. This model was trained via knowledge distillation from a vision-based teacher model that had access to video. The student model uses a CNN → LSTM → MLP architecture to learn a compact, shoe-only representation of human motion, enabling kinematic estimation even when cameras are unavailable.
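The pressure-only classifier described above can be sketched as follows. The 13×9 frame shape and CNN → GRU → MLP structure come from the writeup; the channel widths, hidden size, and class count are illustrative assumptions:

```python
import torch
import torch.nn as nn

class ExerciseClassifier(nn.Module):
    """CNN (spatial) -> GRU (temporal) -> MLP (class head) over
    sequences of 13x9 plantar-pressure frames."""

    def __init__(self, n_classes: int = 6, hidden: int = 64):
        super().__init__()
        self.cnn = nn.Sequential(            # per-frame spatial encoder
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 3)),    # 13x9 -> 4x3
            nn.Flatten(),                    # -> 32 * 4 * 3 = 384
        )
        self.gru = nn.GRU(384, hidden, batch_first=True)
        self.head = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, n_classes))

    def forward(self, x):                    # x: (batch, time, 13, 9)
        b, t, h, w = x.shape
        feats = self.cnn(x.reshape(b * t, 1, h, w)).reshape(b, t, -1)
        _, last = self.gru(feats)            # final hidden state: (1, b, hidden)
        return self.head(last.squeeze(0))    # logits: (batch, n_classes)

logits = ExerciseClassifier()(torch.randn(2, 30, 13, 9))  # 2 clips, 30 frames each
```

Running the CNN per frame and pooling the sequence through the GRU's final hidden state keeps inference cheap enough for real-time use on the streamed frames.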

Challenges we ran into

One major challenge was achieving a high refresh rate while streaming dense pressure data over Bluetooth. Bandwidth limitations forced us to implement aggressive compression and quantization, requiring careful tradeoffs between latency, signal fidelity, and reliability. Tuning this pipeline to remain stable across devices was nontrivial. Another challenge was data collection. Training meaningful biomechanical models requires large volumes of high-quality, labeled movement data, which is time-intensive to collect, especially within a hackathon setting. We had to rely on a combination of limited real-world sessions, synthetic augmentation, and heuristic-driven features to bootstrap useful models under tight time constraints.
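The quantize-then-delta idea behind our BLE pipeline can be sketched like this (the 12-to-8-bit mapping and per-frame layout are illustrative, not our exact wire format):

```python
def quantize_frame(frame, in_bits: int = 12, out_bits: int = 8):
    """Requantize raw ADC counts (e.g. 12-bit) down to a smaller range
    (e.g. 8-bit) by dropping low-order bits."""
    shift = in_bits - out_bits
    return [v >> shift for v in frame]

def delta_encode(prev, cur):
    """Transmit only signed deltas between consecutive quantized frames;
    mostly-static frames become runs of zeros."""
    return [c - p for p, c in zip(prev, cur)]

def delta_decode(prev, deltas):
    """Reconstruct the current frame from the previous one plus deltas."""
    return [p + d for p, d in zip(prev, deltas)]
```

Deltas during a stable stance phase are mostly zero, so a run-length or varint pass on top of this recovers a large share of the BLE budget, at the cost of resending a full reference frame periodically to bound error propagation.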

Accomplishments that we're proud of

One of our proudest accomplishments was building and delivering a fully integrated system end to end, rather than a disconnected demo. We owned everything, from designing and fabricating the 85-sensor pressure-sensing insole and embedded firmware, to streaming data over BLE, running real-time ML inference, and delivering live AI coaching feedback in the browser.

We’re especially proud of building meaningful biomechanical models in a low-data regime. Instead of relying on large labeled datasets, we designed architectures and training strategies that could extract robust signal from limited, noisy plantar-pressure data. Using strong inductive biases, temporal modeling, and knowledge distillation, we built models that classify exercises, detect asymmetry and fatigue patterns, and infer lower-body kinematics using only shoe-mounted sensors. Achieving stable, real-time performance across hardware, software, and ML under tight constraints demonstrated that lab-grade biomechanical insight is possible in a wearable, real-world form factor, even with minimal data.

What we learned

We learned how to integrate two core real-time services, LiveKit and Overshoot, into a single coaching workflow that feels responsive during training and useful in post-session review. LiveKit taught us how to build voice-agent experiences so users can talk to an always-available coach and receive meaningful reflections on their exercises. Overshoot taught us how to add richer context beyond foot pressure by capturing how an exercise is performed, enabling rapid technique feedback and highlighting potentially injury-prone movement patterns. As a team, we also learned how to design reliable real-time pipelines under tight constraints: streaming sensor data, synchronizing multimodal inputs, and turning noisy signals into actionable cues.

What's next for Kinetra (Smart-Insole Technology)

What’s next for the Kinetra sole is a focused push toward refinement, scale, and real-world robustness. On the software side, the priority is collecting larger and more diverse datasets across different users, activities, and fatigue states to further fine-tune our models and improve the accuracy of injury-risk detection and coaching insights. As more data is gathered, Kinetra’s algorithms will continue to learn personalized baselines, enabling more precise, individualized feedback over time. On the hardware side, future iterations will move toward a fully embedded, self-contained sole, integrating sensing, power, and processing directly into the insole form factor to improve durability, comfort, and usability. Together, these advances will allow Kinetra to transition from a powerful prototype into a seamless, everyday system capable of delivering continuous, research-grade biomechanical insight in real-world conditions.
