The Iron Man suit. Because who wouldn't want to try it, after all?
What it does
The project consists of two separate applications: an Android controller app that captures hand positions, and a Unity VR experience running on the Oculus Rift. The player steers by tilting their arms, and the AI provides missions to test your new skills.
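As a rough sketch of how arm tilt could drive the player, here is one common way to derive roll and pitch angles from a raw accelerometer sample. This assumes the standard Android axis convention (gravity on Z when the phone lies flat); the class and method names are illustrative, not the project's actual code.

```java
// Illustrative sketch: derive roll/pitch tilt angles from a raw
// accelerometer sample (standard Android axis convention assumed).
// Class and method names are hypothetical, not from the project.
public class TiltControl {
    /** Roll around the Y axis, in degrees: -90 (left) .. +90 (right). */
    public static double rollDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(ax, Math.sqrt(ay * ay + az * az)));
    }

    /** Pitch around the X axis, in degrees: -90 (nose down) .. +90 (nose up). */
    public static double pitchDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(ay, Math.sqrt(ax * ax + az * az)));
    }

    public static void main(String[] args) {
        // Phone lying flat: gravity entirely on Z, so no tilt.
        System.out.printf("flat: roll=%.1f pitch=%.1f%n",
                rollDegrees(0, 0, 9.81), pitchDegrees(0, 0, 9.81));
        // Phone tilted 45 degrees right: gravity split between X and Z.
        System.out.printf("tilted: roll=%.1f%n", rollDegrees(6.94, 0, 6.94));
    }
}
```

The two angles can then be mapped directly onto the player's steering axes in Unity, so holding both arms level means flying straight.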
How we built it
By storing the phones' accelerometer readings in Firebase, we designed a flexible way to feed input into VR experiences without buying additional hardware. Once that pipeline was set up, we divided the roles to pack as many features as possible into our demo: some of us spent most of the time developing assets and crafting the visuals, while others focused on the user experience and fun factor.
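A minimal sketch of the kind of record such a pipeline might push for each sample. The Firebase Realtime Database accepts a JSON body via an HTTP PUT to a `<path>.json` endpoint; the database host, path, and field names below are assumptions for illustration, not the project's actual schema.

```java
import java.util.Locale;

// Illustrative sketch: the JSON record an Android client could write to
// Firebase for each accelerometer sample. Database host, path, and field
// names are assumptions, not the project's actual schema.
public class AccelPayload {
    /** Builds the JSON body for one accelerometer sample. */
    public static String toJson(String playerId, long timestampMs,
                                double ax, double ay, double az) {
        return String.format(Locale.ROOT,
                "{\"player\":\"%s\",\"t\":%d,\"x\":%.3f,\"y\":%.3f,\"z\":%.3f}",
                playerId, timestampMs, ax, ay, az);
    }

    /** Realtime Database REST endpoint for a PUT of that body. */
    public static String endpoint(String dbHost, String playerId) {
        return "https://" + dbHost + "/controllers/" + playerId + ".json";
    }

    public static void main(String[] args) {
        System.out.println(endpoint("example-project.firebaseio.com", "p1"));
        System.out.println(toJson("p1", 1700000000000L, 0.12, -0.05, 9.78));
    }
}
```

On the Unity side, the game would subscribe to the same path and translate each incoming sample into player motion, which is what makes the phone a controller with no extra hardware.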
Challenges we ran into
Getting the Oculus Rift to connect to the computer; building the city; programming the physics; connecting the Android app to the Unity game in real time; and exporting textures from Blender 3D to Unity.
Accomplishments that we're proud of
The full experience we were able to design and develop in a single weekend, including a dynamic sound system with an integrated voice assistant that guides the user throughout the game.
What we learned
How to set up Unity for the Oculus Rift and other VR platforms, and how to use Blender 3D in conjunction with Unity.
What's next for SuperFlight VR
Improving the city, and adding missions with interactive components that can affect the scene.