A key aspect of scoring in basketball is free throw shooting. Despite how frequently it occurs for players of every position in every game, shooting stance and accuracy vary greatly, even at the professional level. We wanted to build an app solution that is accessible to the average person and helps people improve and find their personalized, ideal shooting form.

What it does

Our smartwatch app uses the accelerometer and gyroscope sensors in a smartwatch to continuously read tracking data and tokenize an individual's shooting form. Initially, users train their model by taking free throws and recording which shots were makes versus misses. With this data, we use feature engineering and random forest classifiers to determine which features of the watch's tracking data, like acceleration or tilt at different time steps, are most directly associated with makes. On each successive free throw, the app informs the user in real time which aspect of their shooting form strayed furthest from the optimal model. This feedback can analytically help them improve their shot.
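The feature-ranking step above could look something like this minimal sketch. The feature names, the synthetic data, and the window summaries are all hypothetical placeholders, not our actual pipeline; it only illustrates how a random forest's feature importances can pick out which aspect of the form is most associated with makes.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical training set: each row is one free throw, summarised by
# acceleration and tilt readings at a few time steps in the shot window.
feature_names = [
    "accel_release", "accel_followthrough",
    "tilt_setpoint", "tilt_release",
]
X = rng.normal(size=(200, len(feature_names)))
# Synthetic labels: pretend makes correlate with tilt at release.
y = (X[:, 3] + 0.3 * rng.normal(size=200) > 0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Rank features by importance: the top feature is the aspect of the
# shooting form most associated with makes in this synthetic data.
ranked = sorted(zip(feature_names, model.feature_importances_),
                key=lambda p: p[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score:.3f}")
```

On this synthetic data the forest correctly ranks the feature the labels were built from (tilt at release) highest, which is exactly the signal the app would surface to the user.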

How we built it

Because the device we had to test on was an Apple Watch, we wrote our app in Swift, leveraging the Core Motion framework to access the sensor data. We stored the real-time stream of sensor readings in an online PostgreSQL database, and wrote a PHP script alongside our Swift app code to let the app write data to it. A Python script reads the data from the database in real time and uses scikit-learn and other machine learning/classification libraries to refine what we regard as our optimal model, against which we compare the data that constitutes each new shot.
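The final comparison step, where a new shot is checked against the model, might be sketched as below. This is an illustrative guess at the idea, not our actual code: it compares a new shot's features against the average profile of made shots to name the worst-deviating feature, and uses the classifier to score the shot.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-shot feature summaries (same placeholder names as before).
feature_names = ["accel_release", "accel_followthrough",
                 "tilt_setpoint", "tilt_release"]

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 4))
y = (X[:, 3] > 0).astype(int)   # synthetic make/miss labels

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
made_profile = X[y == 1].mean(axis=0)   # average form of successful shots

def feedback(shot):
    """Return the feature of this shot that strays most from made shots,
    plus the model's predicted probability of a make."""
    deviations = np.abs(shot - made_profile)
    worst = int(np.argmax(deviations))
    prob_make = model.predict_proba(shot.reshape(1, -1))[0, 1]
    return feature_names[worst], prob_make

new_shot = np.array([0.1, -0.2, 0.05, -1.5])  # hypothetical sensor summary
name, p = feedback(new_shot)
print(f"Most off: {name}; predicted make probability {p:.2f}")
```

In a real pipeline the `new_shot` vector would come from the database stream rather than being hard-coded.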

Challenges we ran into

We initially had trouble reading sensor data from the app. Then we realized we needed a way to determine, while the app collects sensor data in the background, whether or not the user was in the middle of taking a shot. Setting up the MySQL database was fairly smooth, but integrating it with our app was more difficult.
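One simple way to decide whether the wearer is mid-shot, assuming (as a guess at the problem, not necessarily our solution) that a free throw shows up as a burst of high acceleration magnitude in the background stream:

```python
import math

THRESHOLD = 2.0   # illustrative acceleration-magnitude cutoff
MIN_SAMPLES = 3   # consecutive high-magnitude samples to count as a shot

def detect_shot_windows(samples, threshold=THRESHOLD, min_samples=MIN_SAMPLES):
    """Return (start, end) index pairs of runs where |accel| exceeds threshold."""
    windows, start = [], None
    for i, (ax, ay, az) in enumerate(samples):
        if math.sqrt(ax * ax + ay * ay + az * az) >= threshold:
            if start is None:
                start = i          # burst begins
        else:
            if start is not None and i - start >= min_samples:
                windows.append((start, i))
            start = None
    if start is not None and len(samples) - start >= min_samples:
        windows.append((start, len(samples)))
    return windows

# Quiet baseline, a 4-sample burst, then quiet again.
stream = [(0, 0, 1)] * 5 + [(2, 1, 2)] * 4 + [(0, 0, 1)] * 5
print(detect_shot_windows(stream))  # one window covering the burst
```

The detected windows would then be cut out of the stream and summarised into the per-shot feature vectors the model trains on.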

Accomplishments that we're proud of

Our app's ability to distinguish a shot of good form from one of bad form is based entirely on a machine learning model that is personalized to each individual user, and to the task they are practicing. However, the same code we wrote to perfect free throw shooting can be applied directly to any other activity that can be tracked through a smartwatch's sensors. We considered the implications an app like this could have in even more critical fields. Imagine a surgeon being able to practice a mechanical medical procedure with analytical feedback in addition to conventional guidance. The possibilities are endless, and we couldn't be more excited.

What we learned

We learned the basics of iOS and Apple watchOS development, how to set up MySQL databases and host them online so they are accessible from multiple sources, and how to dynamically connect large streams of data to machine learning techniques.

What's next for Release

We intend to improve our modelling process and extend the application of our app to other activities that can be tracked through commonly available wearable sensors.
