Inspiration

As a result of a pandemic that stopped the world in its tracks, people are spending more time than ever at home, sitting in front of screens. Several studies have shown that the more time we spend sitting, the more likely we are to live shorter, unhealthier lives (Benatti & Ried-Larsen, 2015; Eanes, 2018; Korshøj et al., 2018; Lurati, 2018; Perdomo, Gibbs, Kowalsky, Taormina, & Balzer, 2019). Additionally, the more time we spend in front of our electronic devices, the less empathetic we become and the more susceptible we are to developing mental health issues (Baker, Coenen, Howie, Williamson, & Straker, 2018; Berryman, Ferguson, & Negy, 2018; Liu, Bao, Huang, Shi, & Lu, 2020; Torales, O'Higgins, Castaldelli-Maia, & Ventriglio, 2020; Zhai & Du, 2020). This sudden shift to a predominantly digital routine carries a real health cost, and our project seeks to address that gap. Our proposed answer to today's sedentary, stagnant lifestyle is a mental and physical break from our work/study schedules: BreakMotion. BreakMotion is a game that promotes an active 1- to 5-minute break from long Zoom meetings and work/study sessions. Get off your seats and challenge your co-workers or classmates to get in more squats than you in record time.

What it does

BreakMotion tracks the body movements of two or more participants using a pre-trained motion recognition model. Participants are challenged to perform an exercise such as squats or high kicks as a break from prolonged sitting. The game counts how many repetitions (squats, for example) each participant completes and awards the round to whoever has the higher count. A ranking system then tracks every participant across the BreakMotion session, as sketched below. At the end of a long day or week of online meetings, one participant is declared the ultimate FitMaster and earns all the bragging rights so deservingly won.
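
To make the round and ranking mechanics concrete, here is a minimal sketch of how scoring could be tracked, assuming the motion detector supplies each participant's rep count for a break; the class and method names (Leaderboard, record_round, fitmaster) are illustrative placeholders, not the actual BreakMotion code.

```python
# Illustrative sketch of round scoring and session ranking (hypothetical names,
# not the actual BreakMotion implementation).
from collections import defaultdict

class Leaderboard:
    """Tracks round wins per participant and names the FitMaster."""

    def __init__(self):
        self.round_wins = defaultdict(int)

    def record_round(self, rep_counts):
        """rep_counts: dict mapping participant name -> rep count (e.g. squats)
        reported by the motion detector for one 1- to 5-minute break."""
        winner = max(rep_counts, key=rep_counts.get)
        self.round_wins[winner] += 1
        return winner

    def fitmaster(self):
        """Participant with the most round wins at the end of the day or week."""
        return max(self.round_wins, key=self.round_wins.get)


# Example: three breaks over a day of Zoom meetings.
board = Leaderboard()
board.record_round({"Alice": 18, "Bob": 15})   # Alice wins round 1
board.record_round({"Alice": 12, "Bob": 20})   # Bob wins round 2
board.record_round({"Alice": 22, "Bob": 19})   # Alice wins round 3
print(board.fitmaster())                        # -> "Alice"
```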

How I built it

We used a 3D ResNet pre-trained for human activity recognition on the Kinetics-400 dataset (CVPR 2018). The architecture extends the standard ResNet, which ordinarily uses 2D convolution kernels, to 3D kernels so it can capture motion across video frames. We ran the model for human activity recognition using the OpenCV library and the Python programming language.
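
As a rough illustration of this pipeline, the sketch below runs a Kinetics-pre-trained 3D ResNet through OpenCV's dnn module on a sliding window of webcam frames. The model/label file names, the 16-frame window, and the normalization constants are assumptions commonly paired with this model, not necessarily the exact settings we used; each window's predicted label would then feed the rep counting and ranking described above.

```python
# Minimal sketch: classify a participant's webcam activity with a
# Kinetics-400 pre-trained 3D ResNet via OpenCV's dnn module.
import cv2
import numpy as np

MODEL_PATH = "resnet-34_kinetics.onnx"      # 3D ResNet exported to ONNX (assumed file name)
LABELS_PATH = "kinetics_400_labels.txt"     # one Kinetics-400 class per line (assumed file name)
SAMPLE_DURATION = 16                        # frames fed to the network per prediction
SAMPLE_SIZE = 112                           # spatial input size expected by the model

labels = open(LABELS_PATH).read().strip().split("\n")
net = cv2.dnn.readNet(MODEL_PATH)

cap = cv2.VideoCapture(0)                   # webcam of a BreakMotion participant
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
    if len(frames) < SAMPLE_DURATION:
        continue

    # Build a (1, 3, T, H, W) blob from the last T frames.
    blob = cv2.dnn.blobFromImages(
        frames, 1.0, (SAMPLE_SIZE, SAMPLE_SIZE),
        (114.7748, 107.7354, 99.4750),       # per-channel means often used with this model
        swapRB=True, crop=True)
    blob = np.transpose(blob, (1, 0, 2, 3))  # (T, 3, H, W) -> (3, T, H, W)
    blob = np.expand_dims(blob, axis=0)      # add the batch dimension

    net.setInput(blob)
    activity = labels[int(np.argmax(net.forward()))]
    print("Detected activity:", activity)    # e.g. "squat", "high kick"
    frames = []                              # move on to the next window

cap.release()
```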

Challenges I ran into

Training 3D CNNs from scratch (even on just a few motion classes) is extremely time-consuming and requires multiple GPUs/CPUs, which is impractical for a 24-h hackathon. The pre-trained model offered by AIST is trained on the Kinetics dataset with 400 classes, and it misrecognized some exercise motions; for instance, it consistently labeled jumping jacks as hula hooping.

What's next for MAIS HACK 2020 - BreakMotion

  • Incorporate a more diverse range of exercises into the game
  • Integrate the game into a platform such as Zoom for break sessions in breakout rooms
  • Use the wrnchAI API to detect other, more complex human movements

Built With

opencv, python