[GIF: 'Mirror city' immersive geo-spatial demo, demonstrating real-time filtering]
[Image: 'Mirror city' demo, used to visualize telecom data (dropped calls)]
[Image: Snapshot from our Magic Leap IoT sensor data demo, highlighting the sensor data using gesture and user location]
It all started with Star Trek. As a kid, I always wanted my own holodeck to escape the boredom of growing up. When the Oculus DK1 development kit came out in 2013, I finally got a glimpse of that dream becoming a reality. This opportunity made me want to create my own VR/AR business. At the time, I'd also been struggling with designing a product that involved visualizing multiple large, nationwide datasets. The two concepts, VR and data visualization, seemed like a natural fit. If I could add more dimensions to data, I'd solve almost ALL of the challenges I was facing. This insight was the initial inspiration behind BadVR.
However, at the time, immersive hardware was not ready for wide-scale adoption. So the idea for the business was tabled, and I continued working with Jad (now my co-founder) on multiple other (2D) data visualization projects.
Finally, the market and hardware caught up, and Jad and I founded BadVR. We immediately dove into building our dream - a world where data is no longer flat, boring, or 2-dimensional!
What It Does
BadVR uses immersive technology to make the process of visualizing and analyzing data fundamentally faster, more effective, and accessible to non-technical users. Our core value proposition focuses on solving the universal pain users feel when working with their data. By bringing data to an immersive environment, users can dramatically increase the amount of data they're able to analyze, understand, and navigate, while simultaneously lowering the cognitive load required to perform this analysis.
Immersive environments add value to data in multiple ways. Human brains are inherently set up to process 3-dimensional information, so we naturally find it easier to interact with data that's presented in this format. Recall is dramatically increased, and pattern recognition (an important part of data analysis) is also enhanced, allowing users to more quickly discover - and act on - valuable insights previously hidden in their data.
How We Built It
My co-founder Jad and I partnered to build BadVR's initial demos. Jad is a programmer; I am a UX/product designer. Together, we would brainstorm the value of different datasets, exploring which publicly available datasets we thought would benefit the most from being visualized.
After settling on a dataset, I'd dive into the use case. I'd ask myself: What insights would someone be searching for within this data? What parameters would they find value in manipulating? I'd meet with users in that industry vertical and interview them to find out what attributes of the dataset were most meaningful to them, and what answers they were looking to find within it.
From there, I’d get creative and brainstorm ways to visualize the data, while making sure that I still met customer needs and added value to the process of visualizing and analyzing the data. Once I settled on an idea, I’d share it with Jad, who’d give me technical feedback about how to perhaps alter the designs a bit to increase system performance or technical feasibility.
Then, I'd draw up rough wireframes on pieces of paper (not many good tools exist for prototyping in immersive spaces!). I'd then translate these drawings into a 3D Tilt Brush experience that I'd share with Jad. Once he understood the concept, he'd begin coding in Unity. Oftentimes, I'd sit right beside him like a co-pilot, giving him my real-time feedback as he built the demo.
Usability in immersive (VR / AR) products is still very early-stage, so we had to spend a lot of time trying things, failing, and doing it all over again. In some small way, I hope we're contributing to the creation of a standard set of UX & product design best practices for immersive products! Jad and I have certainly learned a lot about how to prevent users from feeling disoriented or sick, which is the unfortunate effect of poor usability in immersive spaces.
Beyond usability challenges, we struggled with prototyping and design because there are no standard Sketch or InVision equivalent tools for doing UX / product design for immersive products. I’d often have to draw out complex 3D environments on flat pieces of paper, then use objects in our office to bring the scene to life and communicate the desired interaction between world and user. I felt more like the director of an imaginary movie in my head than I ever felt like a proper UX designer! ;)
Jad also struggled a lot with the technical challenges of spatializing extremely large datasets. How do you see a million, or a billion, of anything? Beyond the design challenges of presenting this to the user, there are also extreme technical and engineering challenges. I’ve been very impressed with his unique strategies for overcoming these challenges, some of which are now part of our IP.
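The writeup doesn't reveal BadVR's actual strategies (they're part of the company's IP), but one widely used generic technique for making a million-plus points renderable is grid-based aggregation: instead of drawing every raw point, collapse them into a fixed number of cells, each carrying a count and a mean value. A minimal, purely illustrative sketch (the function name and grid size are my own, not BadVR's):

```python
import random
from collections import defaultdict

def aggregate_points(points, grid_size=64):
    """Reduce a huge (x, y, value) point cloud to per-cell aggregates.

    Each grid cell stores the number of points inside it and their mean
    value, so the renderer draws at most grid_size * grid_size cells
    no matter how many raw points come in.
    """
    min_x = min(p[0] for p in points)
    min_y = min(p[1] for p in points)
    span_x = (max(p[0] for p in points) - min_x) or 1.0
    span_y = (max(p[1] for p in points) - min_y) or 1.0

    cells = defaultdict(lambda: [0, 0.0])  # (ix, iy) -> [count, value_sum]
    for x, y, value in points:
        # Map each point's coordinates to a cell index in [0, grid_size)
        ix = min(int((x - min_x) / span_x * grid_size), grid_size - 1)
        iy = min(int((y - min_y) / span_y * grid_size), grid_size - 1)
        cell = cells[(ix, iy)]
        cell[0] += 1
        cell[1] += value

    # Return (count, mean value) per occupied cell
    return {key: (c, s / c) for key, (c, s) in cells.items()}

# 200,000 synthetic points (e.g. dropped-call records) collapse to <= 64*64 cells
rng = random.Random(0)
points = [(rng.random(), rng.random(), rng.random()) for _ in range(200_000)]
cells = aggregate_points(points)
```

In a real immersive pipeline this kind of aggregation would typically run on the GPU or server-side, with the headset only ever receiving the reduced cells.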
Both Jad and I are very proud of overcoming the intense challenges we encountered working with enormous datasets in immersive spaces. Building for a new technology like AR or VR in and of itself is not easy - but add working with extreme-scale datasets and re-imagining them for immersive environments and you’ve got yourself a VERY meaty challenge! Both of us are proud that we’ve been able to produce 3 demos in less than 6 months - 2 VR demos visualizing geo-spatial and financial data, and 1 AR demo on the Magic Leap visualizing IoT sensor data.
As a UX / product designer, I'm very proud to have overcome the operational challenges that resulted from the lack of standard design tools. It's not easy to be a UX designer with no Sketch or InVision! The lack of tools for UX and product design is also coupled with a lack of best practices for usability; the whole process really needs to be rebuilt from the ground up. I'm proud of tackling that challenge head on and still being able to design products that met customer needs, delivered a lot of value, and that didn't make anyone sick!
What Was Learned
While going through the process of designing and building for immersive platforms, we learned that you really need to utilize the enormous amount of space that immersive platforms afford you. It’s easy to port 2D workflows into immersive spaces by adding flat screens everywhere, but that underutilizes the amazing potential of immersive technologies! Properly using this additional space and taking the time to rethink every standard for 2D products really gives your immersive projects the opportunity to add true value to the end user.
People perform workflows very differently in immersive, 3D spaces. You have to design and build products that fit in with the ways users naturally move and work in such environments. Forcing users to return to 2D workflows in 3D spaces is frustrating and lessens the value they get out of the product.
For the BadVR team, we've learned to maximize the space and really spend the time exploring how users will move and work within it. We've also learned a lot about how to design those small micro-interactions (like how long a text overlay should linger on an item after the user has interacted with it, whether the overlay should follow the user's position if they move, and if it does follow, how long a delay to add) so that they feel natural.
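The delayed-follow behavior described above can be sketched with simple exponential smoothing: each frame, the overlay closes a fraction of the remaining gap to the user's position, which reads as a soft, lagged follow rather than a rigid head-lock (a common trigger for discomfort in VR). This is a generic illustration, not BadVR's implementation; the function name and smoothing factor are assumptions:

```python
def follow_with_delay(user_positions, smoothing=0.1):
    """Ease an overlay toward the user's position over successive frames.

    `user_positions` is a list of (x, y, z) tuples, one per frame.
    Each frame the overlay moves `smoothing` (10% here) of the way
    toward the current target, producing a gentle trailing motion.
    """
    overlay = user_positions[0]
    trail = [overlay]
    for target in user_positions[1:]:
        # Close a fixed fraction of the gap on every axis
        overlay = tuple(o + smoothing * (t - o) for o, t in zip(overlay, target))
        trail.append(overlay)
    return trail

# The user jumps from the origin to (1, 0, 0); the overlay eases after them
frames = follow_with_delay([(0.0, 0.0, 0.0)] + [(1.0, 0.0, 0.0)] * 60)
```

Tuning the smoothing factor is exactly the kind of micro-interaction decision the paragraph above describes: too small and the overlay feels detached, too large and it snaps to the head like a HUD.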
In short, we've learned a lot about how to avoid making people motion-sick!
What's next for BadVR?
We're continuing to build demos, service companies in our pilot program, and seek out more businesses interested in revolutionizing their data visualization and analytics potential!
And - as always - we continue to fundamentally reimagine what it means to visualize and analyze data, eliminating pie charts, line graphs, and bar charts one immersive data experience at a time!