Inspiration:

Dancing, yoga, workouts, stretching… today there is an app for almost everything. Whether you are an absolute beginner who wants to learn proper push-up technique or a professional who wants to perfect the yoga Crow Pose, there is a virtual trainer to show you how it should be done. But what if your execution is incorrect and you are not aware of it? The consequences could be serious injuries and no progress in your athletic development. A purely virtual trainer might not be able to save you from these negative effects. Hence we decided to create a trainer that demonstrates the movement in a way that is more comprehensible for the user.

What it does:

TrainAR creates an augmented-reality frame around the user. This frame is a human 3D model that performs dance moves, yoga poses, or workout exercises. When posing in front of the webcam, users see both themselves and the model on screen. They can then align themselves with the model and follow its movements, comparing their own motion to the model's to check whether they are performing the exercises correctly. Users can choose between several exercises, grouped into different categories, and control some settings by voice for easy access during the workout.

You can watch a demo video here: https://www.youtube.com/watch?v=X3iSKvuwtw4&feature=youtu.be

How we built it

We set up our application as a web app. It is built with JavaScript and the JavaScript API of ARToolKit, which handles the placement of our trainer figure. For displaying the 3D animations we used Three.js and its ColladaLoader, which parses the .dae files and converts them into Three.js mesh objects. For convenient interaction with the application we used the annyang voice-recognition library, which lets us detect simple commands such as "play", "stop", "faster", and "slower".
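The voice control described above can be sketched as a small command map. The handler bodies below are illustrative only (in the real application they would drive the Three.js animation playback); the four commands are the ones listed above, and annyang's actual wiring is shown in the trailing comment:

```javascript
// Minimal sketch of the voice-command handling (assumed structure; the
// playback state of a Three.js AnimationAction is modelled here as a
// plain object so the command logic stands on its own).
const playback = { playing: false, speed: 1.0 };

const commands = {
  play:   () => { playback.playing = true; },
  stop:   () => { playback.playing = false; },
  // Speed is adjusted in steps and clamped to a sensible range.
  faster: () => { playback.speed = Math.min(playback.speed * 1.25, 4.0); },
  slower: () => { playback.speed = Math.max(playback.speed / 1.25, 0.25); },
};

// In the browser, annyang maps the spoken phrases to these handlers:
//   annyang.addCommands(commands);
//   annyang.start();
```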

Challenges we ran into

We struggled to find an easy-to-use AR framework: we don't own an iPhone (so ARKit was out), and ARCore was not fully released yet. We therefore had to settle for a less powerful framework, ARToolKit. With remote hosting we have quite long loading times, as the .dae files are fairly large. We are using the JavaScript API of ARToolKit and really underestimated how hard it would be to find documentation for it, which slowed the development process and required a lot of debugging.

Accomplishments that we're proud of

  • Setting up a tech stack none of us was experienced with
  • Understanding and applying frameworks with next to no documentation
  • Squatting in the middle of the night
  • Having loads of fun
  • Producing a great video introduction (you should definitely watch it!)

What we learned

AR is a really fun choice for a hackathon because it's so interactive! (Uncalibrated) webcams are not the best detection devices, but they're still better than you'd think. We also learned a lot about 3D modelling and rapid prototyping.

What's next for TrainAR

Automatically fitting the trainer to the user's body proportions and position would be a huge plus. Detecting when the user collides with the mesh outline and warning them visually would also be great, but it was not realizable in 24 hours with this tech stack. A nice-to-have feature would be the option to upload your own animation files.
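As a rough illustration of how the planned collision warning could work, one could compare the user's tracked screen-space bounding box against the trainer mesh's box. All names and the box-based approach here are assumptions for the sketch, not part of the project's implementation:

```javascript
// Hypothetical sketch of the planned collision warning: two axis-aligned
// boxes in screen space, each { x, y, width, height }.
function boxesOverlap(a, b) {
  return a.x < b.x + b.width &&
         b.x < a.x + a.width &&
         a.y < b.y + b.height &&
         b.y < a.y + a.height;
}

// Returns a warning token when the user's box intersects the trainer's
// outline box, and null otherwise (names are illustrative).
function collisionWarning(userBox, trainerBox) {
  return boxesOverlap(userBox, trainerBox) ? "adjust-pose" : null;
}
```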

Try it out

Follow these simple steps to try out our project:

  1. Visit https://tpfeifle.github.io/trainar/index.html
  2. Allow the site to access your camera and microphone
  3. Print out the marker. Alternatively, you can try downloading it to your phone.
  4. Select your preferred animation/workout from the left-hand side
  5. Show the marker to the camera
  6. Press the pause button in the upper right when the animation is in the right position (alternatively, say "STOP")
  7. Optional: scale/rotate the figure using the sliders in the upper right
  8. Align with the figure and enjoy your personal trainer!
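For step 7, the slider-to-scale mapping might look like the following minimal sketch. The value range and scale limits are assumptions for illustration, not taken from the project source:

```javascript
// Map a UI slider value (assumed 0–100) to a uniform scale factor for
// the trainer model, clamped to an assumed range of 0.5x–2.0x.
function sliderToScale(value, min = 0.5, max = 2.0) {
  const t = Math.min(Math.max(value, 0), 100) / 100;
  return min + t * (max - min);
}

// In Three.js the result would be applied to the loaded model, e.g.:
//   model.scale.setScalar(sliderToScale(slider.value));
```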