Inspiration

We are building a yoga platform because yoga has helped our families recover from depression. As a side benefit, it has made us fitter and more flexible. Throughout COVID-19, people have been required to practice social distancing, and loneliness has become a big problem. We want to empower instructors to produce better-quality content and allow people to do yoga at home, and where possible with friends, to help build community and combat loneliness.

Gyms and yoga studios are considered high-risk areas, which is hurting many small businesses and people’s jobs. We want to digitize yoga and fitness SMBs by giving them tools to create the best-quality content, which in turn provides the best experience for users.

What it does

We are building live-streamed yoga classes as Android, iOS, and web applications. What makes our app special and unlike other live-streaming apps is that we use A.I. pose tracking and stick figures to provide a feedback loop between instructors and students. This way students are able to see each other, and instructors can view all of the students. In Phase 1 we will host the classes ourselves to stabilize the platform; in Phase 2 yoga teachers will be able to run their own yoga businesses on our platform.

How I built it

We used Agora’s Real-Time Engagement Video SDK and run TensorFlow A.I. pose detection on top of it. Once we have the skeleton keypoints, we can draw the stick figure as an augmented-reality overlay. Since you can’t run inference directly on the HTML video element, we create a canvas, redraw the live stream onto it, and run the inference on the canvas itself. Once detection is done, we draw the stick figure through the A.R. overlay on top of the user’s live video feed in real time.
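
A minimal sketch of the inference-and-overlay step, assuming TensorFlow.js PoseNet; the element IDs and function names here are illustrative, not our production code:

```typescript
// Sketch: run PoseNet on a canvas that mirrors the live stream, then draw
// the detected skeleton on a transparent overlay canvas above the video.
import "@tensorflow/tfjs";
import * as posenet from "@tensorflow-models/posenet";

const overlayCanvas = document.getElementById("overlay") as HTMLCanvasElement; // transparent A.R. layer
const overlayCtx = overlayCanvas.getContext("2d")!;

let net: posenet.PoseNet;

export async function loadModel() {
  net = await posenet.load(); // downloads the pose-estimation model weights
}

export async function detectAndDrawPose(frameCanvas: HTMLCanvasElement) {
  // Inference runs on the canvas copy of the frame, not the <video> element itself.
  const pose = await net.estimateSinglePose(frameCanvas, { flipHorizontal: false });

  overlayCtx.clearRect(0, 0, overlayCanvas.width, overlayCanvas.height);
  overlayCtx.strokeStyle = "lime";
  overlayCtx.lineWidth = 3;

  // Connect adjacent keypoints (shoulder–elbow, elbow–wrist, ...) into a stick figure.
  for (const [a, b] of posenet.getAdjacentKeyPoints(pose.keypoints, 0.5)) {
    overlayCtx.beginPath();
    overlayCtx.moveTo(a.position.x, a.position.y);
    overlayCtx.lineTo(b.position.x, b.position.y);
    overlayCtx.stroke();
  }
}
```

The frame-copying loop that feeds this function is sketched in the Challenges section below.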

We also give users the choice to join a public channel, their own private channel, or a channel they create so friends can take the yoga class together. Instructors are subscribed to all of the channels, so students can protect their privacy from other students while the instructor can still guide them. Because we use the Agora SDK across all platforms, Android users can now see web users and vice versa, and instructors see everyone the same way.
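
A rough sketch of how a student might join their chosen channel, assuming the Agora Web SDK 4.x API; the channel names, placeholder `appId`/`token` values, and container element ID are illustrative rather than our actual configuration:

```typescript
// Sketch: a student joins exactly one channel (public, private, or friends-only)
// and publishes their camera and microphone into it.
import AgoraRTC from "agora-rtc-sdk-ng";

const appId = "YOUR_AGORA_APP_ID"; // placeholder

async function joinAsStudent(channel: string, token: string | null) {
  const client = AgoraRTC.createClient({ mode: "rtc", codec: "vp8" });

  // Render every remote user we are allowed to see: classmates in the same
  // channel, plus the instructor.
  client.on("user-published", async (user, mediaType) => {
    await client.subscribe(user, mediaType);
    if (mediaType === "video") user.videoTrack?.play("remote-videos"); // container element id is illustrative
    if (mediaType === "audio") user.audioTrack?.play();
  });

  await client.join(appId, channel, token, null);

  // Publish the local camera and microphone into the chosen channel only.
  const [micTrack, camTrack] = await AgoraRTC.createMicrophoneAndCameraTracks();
  await client.publish([micTrack, camTrack]);
  return client;
}

// The three channel types described above (names are illustrative):
// joinAsStudent("yoga-public-morning-flow", null);
// joinAsStudent("private-user-1234", null);
// joinAsStudent("friends-abc123", null);
```

An instructor-side client could repeat the join step once per channel (one RTC client instance per channel), which is one way to let the teacher see students who cannot see each other.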

Challenges I ran into

Getting A.I. to run on top of the live video feed from Agora’s Video SDK proved to be a little more difficult than we expected, but we solved the problem by redrawing the video feed onto a canvas and then running the inference on the canvas itself. We have documented our solution at https://www.hackster.io/mixpose/running-ai-pose-detection-on-top-of-agora-video-sdk-d812ce
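
A minimal sketch of that workaround; `loadModel` and `detectAndDrawPose` are the illustrative helpers from the earlier sketch, and the module path and element IDs are assumptions (the real code is in the write-up linked above):

```typescript
// Sketch: copy each frame of the Agora-rendered <video> onto a canvas,
// then run pose detection on that canvas copy.
import { loadModel, detectAndDrawPose } from "./pose-overlay"; // illustrative module path

const remoteVideo = document.getElementById("agora-remote-video") as HTMLVideoElement;
const frameCanvas = document.getElementById("frame") as HTMLCanvasElement;
const frameCtx = frameCanvas.getContext("2d")!;

async function start() {
  await loadModel();

  async function renderLoop() {
    // Redraw the current video frame onto the canvas...
    frameCtx.drawImage(remoteVideo, 0, 0, frameCanvas.width, frameCanvas.height);
    // ...and run inference on the canvas copy rather than the <video> element.
    await detectAndDrawPose(frameCanvas);
    requestAnimationFrame(renderLoop);
  }
  requestAnimationFrame(renderLoop);
}

start();
```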

Another challenge was that some users don’t really want to turn on their cameras, so we created a private mode via Agora’s SDK to accommodate their privacy concerns.
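
One way to sketch such a private mode (an assumption about the implementation, again using Agora Web SDK 4.x names): a student with privacy mode enabled publishes audio only, so their camera feed is never sent.

```typescript
// Sketch: in "private mode" the camera track is never created or published,
// so no other participant receives the student's video.
import AgoraRTC, { IAgoraRTCClient } from "agora-rtc-sdk-ng";

async function publishLocalMedia(client: IAgoraRTCClient, privateMode: boolean) {
  const micTrack = await AgoraRTC.createMicrophoneAudioTrack();

  if (privateMode) {
    await client.publish([micTrack]); // audio only: camera stays off
  } else {
    const camTrack = await AgoraRTC.createCameraVideoTrack();
    await client.publish([micTrack, camTrack]);
  }
}
```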

Accomplishments that I’m proud of

We’ve launched the web app at https://mixpose.com and are now testing it with actual users. This is much scarier, because we want to ensure our users have the best experience using our application.

After pushing the Android minimum viable product to Google Play, we received our first few purchases.

Another accomplishment we are very proud of is that we actually have a license to use the yoga music in our classes, and we use it in the demo video. :)

What I learned

A lot: from how to stream high-quality video to how to teach yoga instructors to use the latest software. We also learned how to train A.I. models for yoga poses, and that users are much less forgiving when something goes wrong, especially if they are paying.

What’s next for MixPose Web

We are ready to take this idea forward and turn it into a startup. Three of us co-founders have quit our jobs to work toward it full steam ahead. Try out one of our classes at https://mixpose.com

Built With

  • tensorflow