Zombie screen capture.
The marker we designed to synchronise universes across devices.
AR is the future of mobile gaming, and we wanted to build on the many impressive experiences already on the App Store. So, we built the world's first real-time multiplayer AR experience.
What it does
Take Call of Duty: Zombies. Now, put it on your phone. Cool, right? Now imagine that instead of flat 3D worlds and touch controls, you were actually traversing the battlefield, fighting for survival alongside all of your friends. It's a little tough to put into words, but we think our video demo below shows just how cool the experience actually is.
How we built it
We use ARKit for visual-inertial odometry (VIO), SceneKit for object rendering, and are integrating automated occlusion via a convolutional neural network (SegNet) running in CoreML. Socket.IO and Node.js take care of real-time object transmission and inter-device awareness. The rest is in the hands of the user.
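The key geometric idea behind the marker is that it gives every device a common coordinate frame: ARKit reports the marker anchor's transform in each device's private world frame, and converting points into marker space makes them meaningful to every other device. A minimal sketch of that conversion (the names and matrices here are illustrative, not the project's actual code):

```swift
// Sketch: converting a point from one device's world frame into the
// shared, marker-anchored frame. All names and values are illustrative.

typealias Mat4 = [[Float]]  // row-major 4x4 rigid transform

// Multiply a 4x4 matrix by a homogeneous 4-vector.
func mul(_ m: Mat4, _ v: [Float]) -> [Float] {
    (0..<4).map { r in (0..<4).reduce(Float(0)) { $0 + m[r][$1] * v[$1] } }
}

// Inverse of a rigid transform [R|t] is [Rᵀ | -Rᵀt].
func invertRigid(_ m: Mat4) -> Mat4 {
    var inv: Mat4 = Array(repeating: Array(repeating: Float(0), count: 4),
                          count: 4)
    for r in 0..<3 { for c in 0..<3 { inv[r][c] = m[c][r] } }  // Rᵀ
    for r in 0..<3 {
        var t: Float = 0
        for c in 0..<3 { t += inv[r][c] * m[c][3] }
        inv[r][3] = -t                                         // -Rᵀt
    }
    inv[3][3] = 1
    return inv
}

// Example: the marker sits 1 m in front of this device, unrotated,
// and a zombie stands exactly on the marker.
let markerInWorld: Mat4 = [[1,0,0,0], [0,1,0,0], [0,0,1,-1], [0,0,0,1]]
let worldPoint: [Float] = [0, 0, -1, 1]
let shared = mul(invertRigid(markerInWorld), worldPoint)
// shared ≈ [0, 0, 0, 1]: the same coordinates on every device
```

Once positions live in marker space, they can be serialised and broadcast over Socket.IO without any per-device translation.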
Challenges we ran into
Designing efficient communication structures for optimal cross-platform object encoding was very frustrating. SwiftyJSON came to the rescue, but we would like to go further by implementing more sophisticated custom objects.
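One likely shape for those custom objects is Swift's Codable: instead of pulling fields out of a SwiftyJSON tree by hand, each transmitted entity gets a typed model that decodes straight from the wire format. A hedged sketch (ZombieState and its fields are hypothetical, not the project's actual schema):

```swift
import Foundation

// Hypothetical typed model for an entity sent over Socket.IO.
// Codable gives encoding and decoding for free, so the Node.js server
// and the iOS clients only need to agree on the JSON key names.
struct ZombieState: Codable {
    let id: Int
    let health: Int
    let position: [Float]  // x, y, z in the shared marker frame
}

// Decoding an incoming update from the relay server.
let incoming = #"{"id": 7, "health": 80, "position": [1.0, 0.0, -2.5]}"#
let zombie = try! JSONDecoder().decode(ZombieState.self,
                                       from: Data(incoming.utf8))

// Encoding works the same way in reverse, producing JSON the
// Node.js side can parse directly.
let outgoing = try! JSONEncoder().encode(zombie)
```

Compared with dictionary-style SwiftyJSON access, the typed model catches missing or mistyped fields at decode time rather than deep inside game logic.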
Accomplishments that we're proud of
While there's definitely a lot of room for improvement, we are absolutely ecstatic that we managed to build a convincing shared-dimension experience. We're also really proud of the theoretical maximum of 50 synchronised nodes with our current setup, which encourages us to take our idea further.
What we learned
We learnt that planning is absolutely pivotal, especially in a 24-hour crunch like Hack&Roll. Hydration and sleep are key, and frequent sanity checks and tests prevent pressure from piling up into last-minute debugging.
What's next for AU - Real-time multiplayer AR experience for iOS
We want to refine the experience and implement high-performance automated occlusion, then push it to the iOS App Store to receive feedback and suggestions for other experiences we could create.