When faced with the unlimited possibilities of a VR workspace and the limited ones in reality, we were surprised that nobody had tried to take advantage of this gap. We brought two desktops to our hackathon, but switching between them was a pain, and it still wasn't enough screen space for us. So we decided to create a VR application centered on a UI/UX workplace, so that the physical and economic limitations of reality would not constrain one's productivity in the VR workspace.
What it does
We use the Oculus Rift and Unity to create a VR workspace where you can display tabs and screens, letting you work in a larger environment without having to buy additional monitors and lug them around with you. The VR workspace is highly adaptable to any environment, and you can bring it along anywhere you go. And because everything lives in VR, you can walk around, have your monitors follow you, and customize your own GUI to go with it.
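The "monitors follow you" behavior boils down to head-relative placement with smoothing: each frame, a panel drifts toward a point a fixed distance in front of the user's head instead of snapping there. Our Unity implementation is in C#; the Python sketch below just illustrates the idea, and all function names and parameters here are hypothetical:

```python
def lerp(a, b, t):
    """Linearly interpolate between 3D vectors a and b by factor t."""
    return [ai + (bi - ai) * t for ai, bi in zip(a, b)]

def target_panel_position(head_pos, head_forward, distance=1.5):
    """Place the panel `distance` meters in front of the user's head."""
    return [h + f * distance for h, f in zip(head_pos, head_forward)]

def follow_step(panel_pos, head_pos, head_forward, smoothing=0.1):
    """One per-frame update: move the panel a fraction of the way toward
    its target, so it trails the user smoothly instead of snapping."""
    target = target_panel_position(head_pos, head_forward)
    return lerp(panel_pos, target, smoothing)

# Example: the user stands at the origin looking down +z;
# over ~1 second at 60 fps the panel settles ~1.5 m ahead of their head.
panel = [0.0, 0.0, 0.0]
for _ in range(60):
    panel = follow_step(panel,
                        head_pos=[0.0, 1.7, 0.0],
                        head_forward=[0.0, 0.0, 1.0])
```

The smoothing factor trades responsiveness for comfort: snapping a screen rigidly to the head is disorienting in VR, so letting it lag slightly feels much more natural.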
How we built it
We used the Oculus Rift and Unity to build the VR environment. We also decided to add a camera and machine-learning element to show that the possibilities are not limited to strictly replacing monitors: the workspace can also serve as an FPV HUD and an overall mission-control analysis display. We had to write our own scripts for networking and for using the camera data from the Raspberry Pi. The images taken by the camera are sent to a computer, processed through the Google Cloud Vision API's facial recognition, and then loaded into Unity, where we can view the results.
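The Pi-to-computer leg of that pipeline can be sketched with a simple length-prefixed TCP protocol: each JPEG frame is preceded by its byte length, so the receiver knows where one image ends and the next begins on the stream. This is a minimal illustration under that assumption, not our exact scripts, and the function names are hypothetical:

```python
import socket
import struct

def send_frame(sock, jpeg_bytes):
    """Send one camera frame, prefixed with its 4-byte big-endian length,
    so the receiver can split the byte stream back into images."""
    sock.sendall(struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes)

def recv_exact(sock, n):
    """Read exactly n bytes (TCP may deliver a frame in several chunks)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    """Receive one length-prefixed frame produced by send_frame()."""
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)

# On the receiving computer, each decoded frame would then be handed to the
# Google Cloud Vision client for face detection before the results go to
# Unity, along the lines of:
#   client = vision.ImageAnnotatorClient()
#   response = client.face_detection(image=vision.Image(content=frame))
```

The length prefix matters because TCP is a byte stream with no message boundaries; without it, the receiver has no reliable way to tell where one JPEG stops and the next starts.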
Challenges I ran into
The biggest challenge of the entire hackathon was the lack of a stable, fast wireless connection. This impeded us greatly; for the first 8 hours of the hackathon we were unable to work. The problems persisted with continuous dropped connections over the course of the event, and at times our devices could not connect to the internet at all. Many hours were wasted in this fruitless pursuit. Other challenges included sourcing materials and getting Unity to cooperate, as this was our first time developing for VR in Unity. Getting the hardware to work was also hard, since we needed to interface between our different components (peripherals + computer) and get all the connections working.
Accomplishments that I'm proud of
It was our first time developing in Unity, and it's pretty cool how far we've come. We were able to show a proof of concept of what the workspace might look like, as well as set up the Raspberry Pis to send over images that were then passed through Google Cloud Platform for our machine learning. We're definitely proud of the great strides in VR development we made during this hackathon.
What's next for VR Mission Control
We're planning to expand further by developing a better UI, adding more features, and integrating with other existing technologies to find market fit. A mission control like this will be invaluable for furthering productivity, and it serves as a demo of how VR and UI design can increase productivity, add safety, and help further social growth and development.