When you think big, you might think of our Solar System, or maybe even our galaxy - but our cosmos is far, far vaster than that. Our home galaxy, the Milky Way, hosts hundreds of billions of stars, enough gas to make billions more, and five times more mass in something mysterious called dark matter. At least trillions - that is, millions of millions - of galaxies fill the visible Universe. Together, they form the structure you see here, called the Cosmic Web.
Some galaxies, like our Milky Way, live in groups with a few dozen companions. The Milky Way is on a collision course with our nearest large neighbour, Andromeda. Don't worry, the collision isn't due for another 4 billion years! But when it happens, it will be spectacular. Groups lie along long chains called cosmic filaments, which eventually feed the largest collections of galaxies, called galaxy clusters.
What it does
Look around you. Navigate your way through filaments, groups, and individual galaxies. When you find an interesting object, give it a little attention. If you look long enough, you'll get to zoom in and see it in much more detail, and learn more about that type of object. What do you see? How do the stars, gas and dark matter live in relation to each other?
Every point you see in this experience is a real data point from the Illustris cosmological simulation. This simulation starts with a Universe merely 51 million years old and runs it all the way to the present day, at an age of 13.7 billion years. Astronomers know that in the early days the Universe was largely smooth, just one big hot soup of protons, electrons, photons, and dark matter. If one part of this soup is slightly denser than its surroundings, it attracts other parts to it by gravity and grows denser still over time; regions that started out less dense grow emptier and emptier. As the gas gets denser, some of it collapses into clumps that eventually form stars, and the Universe starts to light up. Finally, by the present day - the epoch we show - the large-scale structure looks like a giant web: the Cosmic Web. The zoom-in at the end takes you to high-resolution data on a single galaxy.
This simulation thus shows our best understanding to date of how the Universe evolves on the largest scales. Urmila published her first academic paper using this very simulation, measuring the dark matter distribution in large groups of galaxies (galaxy clusters) from the light of the stars in the cluster galaxies! Dozens of papers have used this simulation to work out exactly what our current model of dark matter and dark energy predicts, and how those predictions compare to observations. Now, anyone can experience this.
We designed our experience around the user, starting with personas for the audiences we wanted to target.
Then, for each persona, we mapped out their "User Journey" through our application, imagining what would be important at each stage.
Our conclusion from this exercise was to primarily target an audience like "Bob" - someone without much technical knowledge who is interested and wants an exciting experience! We also considered expanding in the future to target researchers like Alex as a secondary audience, who might use this for data analysis that isn't possible in traditional media.
Based on these conclusions, we designed an experience to fit our primary audience. Part of this was a fun, "cool" heads-up display that makes the user feel like they're looking out of an astronaut's helmet. This design choice also helps counteract the nausea that flying through space can cause. We also wanted a system for changing helmets to see different types of information, such as gas and stars. These designs were first sketched on paper, shown below.
Next, the heads-up display was implemented in Inkscape and imported into Unity.
We downloaded snapshots from the openly accessible Illustris simulation (www.illustris-project.org/data). Like most astrophysical simulation data, these came in HDF5 format. We used Python to select the information we wanted to visualize - dark matter and gas densities, star particle colors, and positions for everything - and convert it into .xyz files for Meshlab (http://www.meshlab.net/). Meshlab converted these into PLY files, which we loaded into Unity using an open-source library. This gave us our first view of the Universe in 3D!
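As a rough illustration of the conversion step, here is a minimal sketch. The function name `write_xyz` and the three-point sample data are our own inventions, not part of the Illustris API; in the real pipeline the position arrays would come from an HDF5 snapshot (e.g. via h5py), which we fake here with a small list for brevity.

```python
import io

def write_xyz(positions, out):
    # Write particle positions as a Meshlab-readable .xyz point cloud:
    # one "x y z" line per particle.
    lines = [f"{x:.6f} {y:.6f} {z:.6f}" for x, y, z in positions]
    out.write("\n".join(lines) + "\n")

# In the real pipeline these coordinates came from an Illustris HDF5
# snapshot (a dataset like 'PartType1/Coordinates'); here we fake them.
dark_matter = [(0.0, 0.0, 0.0), (1.5, 2.25, 3.0), (10.0, 20.0, 30.0)]

buf = io.StringIO()  # stands in for an open .xyz file
write_xyz(dark_matter, buf)
print(buf.getvalue())
```

The resulting .xyz file can be dragged straight into Meshlab and exported as PLY.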
We then used the Vive SDK to allow users to fly through the simulation using the controllers.
Challenges we ran into
Loading the particle data into Unity was a huge challenge! We first tried reading in JSON files and building a ParticleSystem, but couldn't get it working. The Meshlab workaround really saved the day.
Rendering the particles was also a challenge: the dataset was so large that we were worried about performance and frame rate. Luckily, we found a public project with a custom mesh renderer for visualising large point clouds (https://blog.sketchfab.com/tutorial-processing-point-cloud-data-unity/).
We also ran into a Unity sound driver issue that prevented any audio from playing. We worked with a mentor to debug this, but unfortunately ran out of time. For this project we developed a simple "fly-around" movement system using the Vive hand-controller triggers; we would like to expand the interactions to support grab-and-move locomotion, more advanced selection of objects (galaxy groups, clusters, etc.) using both hand and eye controllers, and a more in-depth HUD.
Accomplishments that we're proud of
We thought a lot about the user experience and the pedagogical content. We are really proud that we could show an accurate dataset without compromising performance.
We are also really proud that our team (all strangers before this weekend) came together and utilised our diverse skillsets to create a unique and engaging experience! We come from a variety of backgrounds - Astronomy, Biology, UI Research, Design, and Software.
What we learned
Beste, Diego and Paul learned about extragalactic astronomy (including the term extragalactic astronomy), the large scale structure of the Universe, and how astronomers simulate and observe the various components of the Universe. Urmila learned to work with HDF5 files instead of relying on other people's highly specialized wrappers. We all got a really deep dive into Unity and the Vive SDK, including how to load in and render large datasets, how to move around the space and various ways to trigger actions. We quickly gelled as team members and worked like a well-oiled machine!
What's next for CosmoVR
We want to add a lot more sound and information to the app to make it a powerful educational tool for users at various levels, from curious school kids to scientific researchers. We also want to support more zoom levels and the ability to dynamically change scale without loading new scenes. Since a typical dataset in this space has multiple time slices, we'd also like to visualise how these particles move and change over the course of billions of years. And we'd like to keep implementing our planned interaction points.
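One way the time-slice animation could work is to interpolate particle positions between adjacent snapshots. The sketch below is a hypothetical starting point, not something we built: the function name `lerp_positions` and the tiny two-particle arrays are ours, and a real version would also need to match particle IDs across snapshots and handle the simulation box's periodic boundaries.

```python
def lerp_positions(pos_a, pos_b, t):
    # Linearly interpolate each particle between two time slices:
    # t=0 gives snapshot A, t=1 gives snapshot B.
    return [tuple(a + t * (b - a) for a, b in zip(pa, pb))
            for pa, pb in zip(pos_a, pos_b)]

snap_a = [(0.0, 0.0, 0.0), (2.0, 2.0, 2.0)]  # toy snapshot at time A
snap_b = [(1.0, 0.0, 0.0), (2.0, 4.0, 2.0)]  # toy snapshot at time B
print(lerp_positions(snap_a, snap_b, 0.5))   # → [(0.5, 0.0, 0.0), (2.0, 3.0, 2.0)]
```

Feeding frames like these to the point-cloud renderer each update would animate billions of years of structure growth as a smooth flight through time.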
Another goal is to port this experience to a standalone headset and prove out feasibility on lower-performance hardware (like the Oculus Go or GearVR). We believe even a simplified standalone port could increase adoption in museums and schools that don't want to spend extra resources to drive an HTC Vive.
https://assetstore.unity.com/packages/3d/environments/sci-fi/sci-fi-platforms-3d-models-83091 (not used in final project)
https://assetstore.unity.com/packages/2d/textures-materials/sky/free-hdr-sky-61217 (not used in final project)
https://assetstore.unity.com/packages/2d/textures-materials/sky/urban-night-sky-134468
https://nasa3d.arc.nasa.gov/detail/helmet
Unity Libraries and Projects
http://wiki.unity3d.com/index.php/SimpleJSON (not used in final project)