What it does

Our system uses Google's Tango and VR libraries to provide six degrees of freedom of movement with only a smartphone and a Google Tango tablet. The VR experience is comparable to systems like the HTC Vive, without the hassle of being tethered in place.

Note on submission: This DevPost submission contains only a PDF document outlining our work and design; the code is located in the referenced GitHub repository. Even that repository does not contain our entire design, because we accumulated over 1 GB of development assets over the course of the project. We have enclosed a README explaining how to properly build our project.

How we built it

By combining the motion tracking and depth perception capabilities of the Google Tango tablet with the head tracking capabilities of Google's VR library (the successor to Google Cardboard) on a smartphone, linked over a wireless network, we accurately replicate the VR experience without the need for tethering. This implementation effectively inverts the area tracking approach that systems like the Vive use: instead of external base stations watching the player, the optical and infrared point cloud that gives Tango its remarkable spatial tracking works from a first-person perspective, letting us avoid the cumbersome external-perspective environment setup that current spatial VR technologies require. We combined the devices by retrofitting a harness for the Tango tablet onto a Google Cardboard headset, allowing the two to move in coordination with the wearer's perspective.
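The core of this architecture is streaming the tablet's 6-DoF pose (position plus orientation quaternion) to the phone that renders. Our actual implementation uses Unity's networking, but the idea can be sketched minimally in Python; the packet layout, function names, and UDP choice here are illustrative assumptions, not our shipped code:

```python
import socket
import struct

# A 6-DoF pose: position (x, y, z) plus orientation quaternion (qx, qy, qz, qw).
POSE_FORMAT = "<7f"  # 7 little-endian 32-bit floats, 28 bytes per packet

def pack_pose(position, quaternion):
    """Serialize a pose into a compact payload for the wire."""
    return struct.pack(POSE_FORMAT, *position, *quaternion)

def unpack_pose(payload):
    """Deserialize a payload back into (position, quaternion) tuples."""
    values = struct.unpack(POSE_FORMAT, payload)
    return values[:3], values[3:]

def send_pose(sock, address, position, quaternion):
    """Tracker side: fire the latest pose at the rendering device.

    UDP is a natural fit for this pattern: a stale pose is useless,
    so an occasional dropped packet beats head-of-line blocking."""
    sock.sendto(pack_pose(position, quaternion), address)

if __name__ == "__main__":
    # Loopback demo: one process plays both tracker and renderer.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_pose(sender, receiver.getsockname(),
              (1.0, 1.5, -0.5), (0.0, 0.0, 0.0, 1.0))
    payload, _ = receiver.recvfrom(64)
    position, quaternion = unpack_pose(payload)
    print(position, quaternion)
```

In the real system the renderer would apply each received pose to the camera rig every frame, interpolating to smooth over network jitter.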

Challenges we ran into

The Tango isn't powerful enough to render and motion track at the same time, so we added a smartphone to act as a server and handle rendering, leaving the Tango to serve purely as a spatial motion sensor. This separation of concerns also required us to use new (beta) networking features of the Unity engine.

We were also challenged by how to submit our demo, which draws on a large number of free and open-source resources. We ultimately designed a demo that lets the player experience the same environment both with our system and with the HTC Vive, as a compare-and-contrast exercise between the two methods of motion tracking.

Accomplishments that we're proud of

We are extremely excited to have implemented inside-out tracking well enough that our design can rival the experience of a traditional motion tracking system.

What we learned

This hackathon has given both of us a wealth of experience in the VR development process, and in the sorts of challenges that come with hacking hardware and software simultaneously on a short deadline. Learning to work together, yet independently, has been one of the best developments for our personal growth as developers.

What's next for Inside Out Tracking for the Brave

We would like to get our hands on more powerful devices capable of running our full technology stack on a single independent device, further reducing equipment costs and lowering the barrier to entry for free-roam VR experiences.
