VR has countless applications, from something as simple as nature exploration to something as complex as an RPG. So far, however, the VR space has had real limitations: we have never seen a large-scale open-world game in VR. Because of the limits on what a VR device can store and process, large and complex maps are difficult to work with in VR (I've tried...). We wanted to develop such an application because true immersion requires true openness. To overcome this obstacle, we turned to procedural generation.
What it does
Our Unity project uses procedural generation to create terrain around the player, which the user can then reshape through keyboard input.
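To sketch what "reshaped through keyboard input" could look like, here is a hypothetical deformation routine (written in Python for brevity; the project itself is C# in Unity). The key bindings, function names, and falloff shape are our illustration, not the project's actual code:

```python
import math

# Hypothetical sketch of keyboard-driven terrain edits; bindings and
# falloff are assumptions for illustration, not the project's code.

KEY_ACTIONS = {"e": +0.2, "q": -0.2}   # assumed bindings: raise / lower

def deform(heights, cx, cz, amount, radius=3):
    """Add `amount` to the heightmap around (cx, cz), fading linearly to
    zero at `radius` so the edit blends into the surrounding terrain."""
    for z in range(max(0, cz - radius), min(len(heights), cz + radius + 1)):
        for x in range(max(0, cx - radius), min(len(heights[0]), cx + radius + 1)):
            d = math.hypot(x - cx, z - cz)
            if d < radius:
                heights[z][x] += amount * (1 - d / radius)

def on_key(heights, key, player_cell):
    """Apply the edit bound to `key` at the cell under the player."""
    if key in KEY_ACTIONS:
        deform(heights, *player_cell, KEY_ACTIONS[key])
```

The linear falloff keeps edits from creating vertical cliffs at the edge of the brush, which would otherwise stand out badly in generated terrain.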
How we built it
The entire project was built in Unity. We start by sampling a Perlin noise generator, whose output is transformed into height data for the terrain around the user. Terrain is loaded in chunks, which are added and removed as the user moves.
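The pipeline above can be sketched as follows. This is illustrative Python, not the project's code: the actual implementation is C# in Unity (which provides Mathf.PerlinNoise built in), so every name here is our own invention:

```python
import math

def perlin2d(x, y, perm):
    """Classic 2D Perlin gradient noise (zero at integer lattice points)."""
    xi, yi = math.floor(x), math.floor(y)
    xf, yf = x - xi, y - yi

    def fade(t):                       # Perlin's 6t^5 - 15t^4 + 10t^3 easing
        return t * t * t * (t * (t * 6 - 15) + 10)

    def grad(h, dx, dy):               # hash picks one of 4 diagonal gradients
        return (dx if h & 1 else -dx) + (dy if h & 2 else -dy)

    def hash2(i, j):                   # permutation-table lattice hash
        return perm[(perm[i & 255] + j) & 255]

    u, v = fade(xf), fade(yf)
    n00 = grad(hash2(xi,     yi),     xf,     yf)
    n10 = grad(hash2(xi + 1, yi),     xf - 1, yf)
    n01 = grad(hash2(xi,     yi + 1), xf,     yf - 1)
    n11 = grad(hash2(xi + 1, yi + 1), xf - 1, yf - 1)
    nx0 = n00 + u * (n10 - n00)        # interpolate along x, then y
    nx1 = n01 + u * (n11 - n01)
    return nx0 + v * (nx1 - nx0)

def chunk_heights(cx, cz, perm, size=16, scale=0.05):
    """Sample the noise over one chunk: a size x size grid of heights."""
    return [[perlin2d((cx * size + x) * scale, (cz * size + z) * scale, perm)
             for x in range(size)] for z in range(size)]

def stream_chunks(loaded, player_chunk, radius, perm):
    """Keep only the chunks within `radius` of the player's current chunk."""
    px, pz = player_chunk
    wanted = {(px + dx, pz + dz)
              for dx in range(-radius, radius + 1)
              for dz in range(-radius, radius + 1)}
    for coord in wanted - loaded.keys():       # entered range: generate
        loaded[coord] = chunk_heights(*coord, perm)
    for coord in set(loaded) - wanted:         # left range: unload
        del loaded[coord]
```

Here `perm` is a shuffled permutation of 0–255 (the standard Perlin setup). Because every chunk samples the same global noise field, adjacent chunks line up seamlessly at their borders, and memory use stays bounded by the streaming radius rather than the world size.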
Challenges we ran into
We would have liked to run the application in VR and use speech-to-text to control the environment, but neither worked out. We could not get the application to run on a Google Cardboard (this being our first time working with VR), and our speech-to-text system would not run on mobile.
Accomplishments that we're proud of
We also built a second project alongside our first, more as a cool application related to Red Bull.
What we learned
We learned a lot about programming in Unity. We had little experience with Unity and C# up to this point, and this project really helped us develop those skills. We also learned a lot about computer vision and OpenCV through our side project.
What's next for God's Sandbox
Uploading to mobile VR and implementing speech-to-text terrain control are the next steps for God's Sandbox.