It's 42 minutes to deadline, what can I say? It's been quite a journey.

Inspiration

As a developer in the somewhat alien world of 3D artists, it's often hard for me to do things that come naturally to artists: pull vertices, place monsters in 3D space on a flat screen, spend hours refining what the machine has missed. Anything that brings that virtual world into reality and blends the two makes my life easier. All of my life experience is in 3D, so it's only natural to edit 3D projects that way too, no? That's why I built byplay.io, an AR mobile tool for video tracking. It's already something, but in mobile AR there's still a tiny phone between the virtual and the real. With MR that edge finally blurs almost to nothing, and now it's on us developers to let other people enjoy the magic.

I want to bring the virtual into reality for the people who arguably use it the most: filmmakers and cinematographers. They're very visual people who care about details, and they often don't want to deal with complicated tech, wires, and weird people on set. So the idea of using a $500 headset as the ultimate on-set previs tool is just breathtaking. Imagine: actors see a "real" werewolf jump at them, rehearse with it, and act much better. How much more fun CGI shoots could become!

What it does

  • Places 3D models (including animated ones) and lets the user position them precisely. This is already a big deal: thanks to MR, filmmakers see the "real" scale and motion of the model right on set
  • Simulates a camera and records MP4 video, using the Scene Mesh to reconstruct the background; the recording is saved to the Oculus Files app. A cinematographer with no 3D skills can single-handedly plan a shot, set the camera, record it, and send it to colleagues
  • Mirrors user motion with the Motion API: body tracking animates an avatar, so the user can take part in the shot as an actor or one of the "monsters"
  • Facilitates model import by running an HTTP server on the Quest that serves a web page where users can upload 3D models. It's also easy to integrate with 3D software as an API
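The import-server idea above can be sketched in a few lines. The real app runs an equivalent server inside Unity on the headset; this is a minimal Python stand-in, and the port, upload directory, and X-Filename header are illustrative assumptions, not the app's actual values.

```python
# Minimal sketch of the on-headset import server: GET serves an upload
# page, POST saves the request body as a model file.
import os
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

UPLOAD_DIR = "uploads"  # hypothetical; the real app saves to app storage

PAGE = b"""<html><body>
<form method="post" enctype="multipart/form-data">
  <input type="file" name="model"> <input type="submit" value="Upload">
</form></body></html>"""

class ImportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the upload page to anyone on the same network as the Quest.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def do_POST(self):
        # Treat the body as a single model file; a real server would parse
        # multipart/form-data and validate the format (.fbx, .abc, ...).
        length = int(self.headers.get("Content-Length", 0))
        data = self.rfile.read(length)
        os.makedirs(UPLOAD_DIR, exist_ok=True)
        name = os.path.basename(self.headers.get("X-Filename", "model.bin"))
        with open(os.path.join(UPLOAD_DIR, name), "wb") as f:
            f.write(data)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"saved")

def run(port: int = 8080) -> HTTPServer:
    """Start the import server in a background thread; returns the server."""
    srv = HTTPServer(("0.0.0.0", port), ImportHandler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv
```

Because it's plain HTTP, a DCC plugin or pipeline script can push models with a single POST instead of going through the web page.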

How we built it

In Unity, with Oculus SDK. Nothing too exciting here.

Accomplishments that we're proud of + challenges

  • Recording video from Unity. This is a non-trivial task that isn't documented anywhere. Even though I'd already done it for the Byplay Android app, adapting it to Oculus took several hours of low-level graphics headache
  • Precise selection. When the user places several objects, they want to point at one and select it to move or delete it. Normally selection is done with colliders, but colliders aren't very precise, and they're just horrible for animated objects. I can't tell a film director there's an invisible box they need to aim at. So instead I mount an additional camera on the controller, and that camera renders each object in its own unique color. I sample the center of the image and look the color up to find the object. Works like a charm, and I'm very proud of it. Maybe it's a standard procedure, but somehow I'd never encountered it
  • Alembic import. The app also supports importing Alembic, a professional format developed by Sony Pictures Imageworks and ILM. The official Unity package only supports it on desktop platforms, but I managed to build the C++ project for Android by swapping out some libraries
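The color-ID picking trick can be shown as a round trip: pack each object's ID into a unique 24-bit RGB color, render with those colors, then decode the sampled center pixel back to an object. This is a Python sketch of just that lookup logic under assumed names; in the app the rendering is done by the extra Unity camera on the controller.

```python
# Sketch of color-ID picking: one unique 24-bit color per object,
# then decode the center pixel of the ID render back to an object.

def id_to_color(obj_id: int) -> tuple:
    """Pack a 24-bit object ID into an (r, g, b) triple."""
    assert 0 <= obj_id < 1 << 24
    return ((obj_id >> 16) & 0xFF, (obj_id >> 8) & 0xFF, obj_id & 0xFF)

def color_to_id(rgb: tuple) -> int:
    """Recover the object ID from a sampled pixel color."""
    r, g, b = rgb
    return (r << 16) | (g << 8) | b

def pick(frame, width, height, objects):
    """Sample the center pixel of an ID frame and look up the object.

    `frame` is a row-major flat list of (r, g, b) pixels; ID 0 is
    reserved for the background, so pointing at nothing returns None.
    """
    center = frame[(height // 2) * width + width // 2]
    return objects.get(color_to_id(center))
```

Unlike collider raycasts, this is exact per pixel, so it keeps working on skinned, animated meshes with no invisible boxes to aim at.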

What we learned

  • Android in the Oculus flavor is surprisingly pleasant to develop for!

What's next for Byplay Space

We're in talks with a VFX supervisor on a major project for a huge streaming service about using the app to let actors and directors see 3D monsters on set. The app is brand new, so they haven't tried it yet, but they're very enthusiastic.
