Machine learning is slowly being integrated into our daily lives through apps and advertisements. The biggest obstacle to understanding neural networks is that a great deal happens between the input and output neurons.
Virtual Reality is being used to help people comprehend large concepts by allowing them to reach out and touch abstract ideas.
What it does
The user is tasked with helping a car traverse its vaporwave home by adjusting the axons of its brain. A cloud randomly drops obstacles into the vehicle's path.
The brain is represented with 3D axons and neurons that can be moved anywhere in the play space. The car has three eyes that feed inputs to the neural network, and its ability to turn serves as the output.
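Conceptually, the car's brain is a small feed-forward network: the three eyes are the inputs, the adjustable axons are the weights, and the amount to turn is the output. Here is a minimal sketch of that idea in Python (the project itself is written in C#, and the layer size and tanh activation here are assumptions for illustration, not the in-game values):

```python
import math

def steer(eyes, w_hidden, w_out):
    """Map three eye readings to a steering value in (-1, 1).

    eyes     : three input values from the car's eyes
    w_hidden : one weight row per hidden neuron (the "axons" from the eyes)
    w_out    : one weight per hidden neuron (the axons to the turn output)
    """
    # Each hidden neuron sums its weighted inputs and squashes with tanh.
    hidden = [math.tanh(sum(w * e for w, e in zip(row, eyes)))
              for row in w_hidden]
    # The single output neuron does the same over the hidden layer.
    return math.tanh(sum(w * h for w, h in zip(w_out, hidden)))
```

Moving an axon in the play space would then correspond to changing one of these weights, which changes how strongly an eye influences the turn.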
How we built it
All the models were made in Blender, and the scripts were written in C#. The game was built in Unity using Oculus's developer tools.
Challenges we ran into
Some of us were inexperienced with GitHub, which caused problems: materials did not sync correctly and prefabs stopped working. Several prefabs broke toward the end, causing a major headache for our main programmer.
The Oculus documentation was not as clear or thorough as we would have liked, so we had to devise our own way of tracking the player's hands and handling the controller buttons.
What we learned
We learned how to develop games for the Oculus Quest, how to model in Blender, and how to code more efficiently in C#.
Future plans for this project
+ Make the neural network fully adjustable
+ Implement an automatic setting that lets the car learn on its own
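One way the "learn on its own" setting could work is a simple hill-climbing loop: randomly nudge the brain's weights, and keep the mutant only if the car drives farther before crashing. A Python sketch of that idea, assuming a distance-based fitness function and an eight-weight brain (both are placeholders, not part of the project):

```python
import random

def mutate(weights, rate=0.1):
    """Return a copy of the weights with small random nudges."""
    return [w + random.gauss(0, rate) for w in weights]

def evolve(fitness, n_weights=8, generations=50):
    """Hill-climb: keep a mutated brain only if it scores higher.

    fitness : function scoring a weight list (e.g. distance driven)
    """
    best = [0.0] * n_weights
    best_score = fitness(best)
    for _ in range(generations):
        candidate = mutate(best)
        score = fitness(candidate)
        if score > best_score:          # only ever accept improvements
            best, best_score = candidate, score
    return best
```

Because the loop only accepts improvements, the learned brain is guaranteed to score at least as well as the starting one, which makes it a safe (if slow) baseline before trying fancier training.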