Explore was inspired by accessible virtual reality and augmented reality tools for mobile devices. Not only do these technologies expand the horizons for consumer-level technology, but they pave the way for the acceptance of such technologies in medical institutions and for industrial data analysis and visualization. The tools let us do more for less.

What it does

Explore is a technological proof-of-concept that we showcase with a number of different implementations. Specifically, Explore is a virtual reality environment for the Google Cardboard (running on Android mobile devices) that leverages computer vision algorithms to learn about the features in its environment. Using this, we expand the Google Cardboard to track position in addition to orientation.

Explore is well suited for:

- Games
- Medical Simulation
- Art
- Data Visualization

How I built it

The Cardboard is a device that users place on their head to experience immersive virtual reality. It works by mounting a phone in front of your eyes and splitting the screen in half, sending each side to one eye. This produces a stereoscopic effect, giving users a sense of depth not dissimilar to a 3D film at the theatre. The Cardboard then uses your device's on-board gyroscope to provide 360-degree head tracking so you can look around.

The Cardboard is not exactly a fashion item, so we had no problem cutting a hole in ours so that the phone's camera could peer through. Using the camera, we integrated the Cardboard with an augmented reality API, Vuforia. Vuforia lets us track features in our environment, recovering both rotational and positional information, whereas the Cardboard on its own registers only orientation. With this technique, the range of possible applications becomes much broader.

We then used Unity (along with various other asset creation and analytics tools) to create a number of demos showcasing the uses we envision for Explore.
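The key upgrade described above is going from 3-DoF (orientation only) to 6-DoF (orientation plus position) tracking. A minimal sketch of the idea, in Python rather than Unity, is composing a gyroscope-reported rotation with a vision-tracker-reported translation into one camera pose matrix. This is illustrative only: the function name and the yaw-only rotation are our simplifications, not Vuforia's actual API, which reports full quaternion poses relative to tracked targets.

```python
import math

def pose_matrix(yaw, position):
    """Compose a camera pose from a gyroscope yaw angle (radians)
    and a tracker-reported position (x, y, z).

    Sketch only: real head tracking uses full 3-DoF rotations,
    but one axis is enough to show how the pieces combine.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    x, y, z = position
    # Rotation about the vertical (y) axis in the upper-left 3x3,
    # translation in the last column.
    return [
        [  c, 0.0,   s,   x],
        [0.0, 1.0, 0.0,   y],
        [ -s, 0.0,   c,   z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Orientation-only tracking (plain Cardboard) keeps the translation
# column fixed at the origin; adding a vision tracker lets it change
# as the wearer physically moves.
look_left = pose_matrix(math.pi / 2, (0.0, 0.0, 0.0))
stepped_forward = pose_matrix(0.0, (0.0, 0.0, -0.5))
```

The design point is that the two sensors are complementary: the gyroscope fills the rotation block at high frequency, while the camera-based tracker supplies the translation column the gyroscope alone cannot provide.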

Challenges I ran into

Combining various development ecosystems into a polished product is always a challenge, especially when accounting for performance constraints on mobile devices. Our biggest challenge was certainly fitting everything onto our phones: in a single app, we perform high-quality 3D rendering and real-time data analytics, and provide an immersive, real-time interface, all while tracking our environment and doubling the rendering burden by drawing the scene in stereo. We had to make it work and then do it twice (once for each eye).
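The "do it twice" part can be sketched as a per-eye render loop: each frame, the camera is offset by half the interpupillary distance (IPD) to either side of the head and the scene is drawn once into each half of the screen. Everything here is a placeholder illustration of that structure; `render_scene` and the viewport names are hypothetical, not an engine API.

```python
IPD = 0.064  # approximate average interpupillary distance, in metres

def eye_positions(head_position, ipd=IPD):
    """Return (left, right) camera positions offset along the head's x axis."""
    x, y, z = head_position
    half = ipd / 2.0
    return (x - half, y, z), (x + half, y, z)

def render_frame(head_position, render_scene):
    # Render the same scene twice per frame, once into each half of
    # the screen -- this is what doubles the rendering burden.
    left, right = eye_positions(head_position)
    return [render_scene(left, viewport="left"),
            render_scene(right, viewport="right")]
```

The small horizontal offset between the two renders is what produces the stereoscopic depth cue, at the cost of roughly twice the draw work per frame.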

Accomplishments that I'm proud of

In our opinion, a weekend of hacking any app together is not likely to produce a polished product that we will be proud of for years to come. As a team, we were most proud of our ability to work well in parallel and use our own strengths and vision to provide unique and complementary solutions and extensions.

What I learned

We come away from events like these not dwelling on what we have learned, but on how much we still have to learn. The best thing we walk away with is the sense of accomplishment that will fuel our future endeavors to learn and create.

What's next for Explore

Explore isn't quite ready for prime time, but we'll make sure it gets there. We'll be extending Explore into a more natural, user-centric experience with the addition of some tech we're working on, and by exploring and leveraging modern computer vision techniques.
