We are building a yoga platform because yoga has helped our families get out of depression. As a side effect, it has also made us more flexible. Throughout COVID-19, people have been required to practice social distancing, and loneliness has become a big problem. We want to empower instructors to produce better-quality content and let people do yoga at home, and if possible with friends, to help create community and battle loneliness.
What it does
We are building a live-stream yoga class web application. What makes our app special and unlike other live-streaming apps is that we use A.I. pose tracking and stick figures to provide a feedback loop between teachers and students. This way students are able to see each other, and instructors can view all of their students. TigerGraph in the backend powers fast analytics tools for instructors so they can run their classes better.
How I built it
We used TigerGraph and GSQL for data analytics, exporting Firebase data directly into TigerGraph. For the hackathon itself we created three vertex types, Lesson, User, and Instructor, and five edge types: users being friends with each other, a user attending a class, a user giving feedback on a class, an instructor teaching a class, and a user following an instructor. We also wrote additional GSQL queries to power the analytics tools.
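To make the schema concrete, here is a sketch of the graph in TypeScript. The vertex names come from the write-up, but the edge names are our own illustration of what the GSQL CREATE VERTEX / CREATE EDGE statements define; the actual identifiers may differ.

```typescript
// Hedged sketch of the MixPose graph schema: 3 vertex types, 5 edge types.
// Edge names below are illustrative, not the exact GSQL source.
const VERTEX_TYPES = ["User", "Instructor", "Lesson"];

const EDGE_TYPES = [
  { name: "friend_of",     from: "User",       to: "User" },       // users are friends with each other
  { name: "attended",      from: "User",       to: "Lesson" },     // a user attends a class
  { name: "gave_feedback", from: "User",       to: "Lesson" },     // a user gives feedback on a class
  { name: "teaches",       from: "Instructor", to: "Lesson" },     // an instructor teaches a class
  { name: "follows",       from: "User",       to: "Instructor" }, // a user follows an instructor
];
```

Keeping friendships, attendance, and follows as separate edge types lets GSQL queries traverse exactly the relationship they need, for example walking `attended` edges from a Lesson to count its students.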
We used Agora’s Real-Time Engagement Video SDK and ran TensorFlow A.I. pose detection on top of it; once we get the skeleton points, we can draw the stick figure as an Augmented Reality overlay. Since you can’t run inference directly on the HTML video element, we created a canvas, redraw the livestream onto it, and run the inference on the canvas itself. After detection is done, we draw the stick figure as an AR overlay on top of the user’s live video feed in real time.
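The "redraw, then infer" loop can be sketched as follows. This is a simplified illustration under our assumptions, not the production code: `stickFigureSegments` is a hypothetical helper, the bone list is abbreviated, and the detector is assumed to expose an `estimatePoses()` method in the style of the TensorFlow pose-detection models.

```typescript
// A detected keypoint from the pose model.
type Keypoint = { name: string; x: number; y: number; score: number };

// A stick figure is a set of "bones", each connecting two named keypoints.
// (Abbreviated; a full skeleton has more pairs.)
const BONES: Array<[string, string]> = [
  ["left_shoulder", "right_shoulder"],
  ["left_shoulder", "left_elbow"],
  ["left_elbow", "left_wrist"],
  ["left_shoulder", "left_hip"],
];

// Pure helper: keep only the bones whose endpoints were both detected
// with enough confidence.
function stickFigureSegments(keypoints: Keypoint[], minScore = 0.3): Array<[Keypoint, Keypoint]> {
  const byName: { [name: string]: Keypoint } = {};
  for (const k of keypoints) byName[k.name] = k;
  const segments: Array<[Keypoint, Keypoint]> = [];
  for (const bone of BONES) {
    const a = byName[bone[0]], b = byName[bone[1]];
    if (a && b && a.score >= minScore && b.score >= minScore) segments.push([a, b]);
  }
  return segments;
}

// Browser-side loop (typed loosely so the sketch stands alone).
function renderLoop(video: any, canvas: any, detector: any): void {
  const ctx = canvas.getContext("2d");
  // 1. Redraw the current livestream frame onto the canvas,
  //    since inference can't run on the <video> element directly.
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  // 2. Run pose inference on the canvas itself.
  detector.estimatePoses(canvas).then((poses: Array<{ keypoints: Keypoint[] }>) => {
    // 3. Draw the stick figure as an overlay on top of the feed.
    for (const pose of poses) {
      for (const seg of stickFigureSegments(pose.keypoints)) {
        ctx.beginPath();
        ctx.moveTo(seg[0].x, seg[0].y);
        ctx.lineTo(seg[1].x, seg[1].y);
        ctx.stroke();
      }
    }
    // 4. Schedule the next frame (a browser would use requestAnimationFrame).
    setTimeout(() => renderLoop(video, canvas, detector), 33);
  });
}
```

The score threshold keeps the overlay from jittering when the model is unsure about a joint, e.g. when a wrist leaves the frame.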
We also give users the choice to join a public channel, their own private channel, or a channel they create so friends can take the yoga class together. Instructors are subscribed to all the channels. This way students can protect their privacy from other students while still allowing the teacher to guide them. Because we use the Agora SDK across all platforms, Android users can now see web users and vice versa, with instructors seeing everyone indistinguishably.
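The channel routing can be sketched like this; the channel naming scheme and the helper names are our own illustrative assumptions, not the actual MixPose code.

```typescript
// A student's choice of how to join a lesson.
type ChannelChoice =
  | { kind: "public"; lessonId: string }
  | { kind: "private"; lessonId: string; userId: string }
  | { kind: "friends"; lessonId: string; groupId: string };

// Map a student's choice to the (hypothetical) channel name they join.
function resolveChannel(choice: ChannelChoice): string {
  switch (choice.kind) {
    case "public":
      return `lesson-${choice.lessonId}-public`;
    case "private":
      return `lesson-${choice.lessonId}-user-${choice.userId}`;
    case "friends":
      return `lesson-${choice.lessonId}-group-${choice.groupId}`;
  }
}

// The instructor subscribes to every distinct channel in the lesson, so
// they see all students while each student only sees their own channel.
function instructorChannels(choices: ChannelChoice[]): string[] {
  const seen: { [channel: string]: boolean } = {};
  const channels: string[] = [];
  for (const choice of choices) {
    const channel = resolveChannel(choice);
    if (!seen[channel]) {
      seen[channel] = true;
      channels.push(channel);
    }
  }
  return channels;
}
```

In a real client, each resolved name would be passed to the Agora client's join call, and the instructor's client would subscribe to every name returned by `instructorChannels`.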
Challenges I ran into
Getting A.I. to run on top of the live video feed from Agora’s Video SDK proved a little more difficult than we thought, but we solved the problem by redrawing the video feed onto a canvas and then running the inference on top of the canvas itself.
GSQL was another challenge, since it was yet another new tool to learn; we are documenting the detailed step-by-step experience at https://www.hackster.io/364351/how-to-use-tigergraph-for-analytics-e476fa
We are writing up our A.I. solution at https://www.hackster.io/mixpose/running-ai-pose-detection-on-top-of-agora-video-sdk-d812ce. Another challenge is that some users don’t really want to turn on their camera, so we created a private mode via Agora’s SDK to accommodate their privacy concerns.
Accomplishments that I’m proud of
We’ve launched the web app at https://mixpose.com and are now testing it with actual users. This is much scarier, because we want to ensure our users have the best experience using our application.
Another accomplishment we are very proud of is that we actually have the license to use the music in the demo video :)
What I learned
We learned GSQL for the first time, and running graph queries turns out to be really powerful.
What’s next for MixPose Web
We are ready to take this idea forward and turn it into a startup. Three of us co-founders have quit our jobs to work on it full steam ahead.