Attack your opponents in a shared space!
Use Image Recognition to heal up!
Hit them in real-time!
Shoot the enemies
Inspiration
Our team came in expecting to use AR to build a mobile, headset-free experience. While we've loved tinkering with AR and VR headsets, we know that mobile experiences will be the most accessible and widely used, at least in the near term. Regardless of platform, single-user AR/VR experiences can get stale as the novelty wears off. For that reason, we wanted to build a multiplayer, shared AR experience.
As it stands right now, there are real limitations to creating shared AR experiences, and only two known ways to achieve them. One is third-party external hardware that pairs two devices. The other relies on GPS, which comes with a high risk of inaccuracy (+/- 10 meters). We realized we had to get creative to link the two devices without directly connecting them to each other.
What it does
BlastARs is a two-player dueling game that pits aliens against astronauts. Players shoot lasers at their opponent to lower their health, and can gain energy points by pointing their camera at several pre-determined objects that the game recognizes. Each user's position is tracked by the other's device in order to simulate a shared AR experience.
How we built it
BlastARs was built using ARKit, ML Kit, the Firebase Realtime Database, and Sketch for screen design. The project hinges on two-way user tracking, which starts at game initialization: users touch the backs of their phones together, offset so as not to cover the cameras. From then on, each user's positioning data (relative to their device's coordinate space) is sent to the database, and the other user pulls that data to reconstruct an estimate of their opponent's position. From that point, users can interact accurately in their shared AR space.
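The writeup describes the alignment step in prose only. As a minimal sketch of the vector math involved, assume (hypothetically; this function name and the exact convention are not from the project) that the back-to-back initialization gives both devices a shared origin at the touch point, with each player's world frame being the other's rotated 180 degrees about the vertical axis. Mapping the opponent's reported position into the local frame is then a simple sign flip:

```python
# Hypothetical sketch of the frame alignment implied by the back-to-back
# start: if both devices set their world origin at the moment the phones
# touch, the opponent's frame is roughly my frame rotated 180 degrees
# about the vertical (y) axis. Mapping a point from their frame into mine
# then just negates the x and z components. Names are illustrative only.

def opponent_position_in_my_frame(p):
    """Map an (x, y, z) position from the opponent's world frame into mine."""
    x, y, z = p
    # 180-degree rotation about y: (x, y, z) -> (-x, y, -z)
    return (-x, y, -z)

# Example: the opponent steps 2 m "forward" in their frame (ARKit's
# forward direction is -z), i.e. to (0, 0, -2). In my frame that lands
# at (0, 0, 2): on the far side of the shared origin, as expected.
```

In practice the transform would also have to account for the small physical offset between the two phones and for tracking drift over time, which is presumably part of why accuracy improvements are called out under "What's next".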
Challenges we ran into
It took several iterations and lots of trial and error to figure out how best to configure device positions. This was a challenge not only from a development perspective but also from a UX perspective: because AR is a relatively new technology, we were faced with the responsibility of teaching users how to use it. As a result, we knew we had to model the UI and UX on other full-screen camera apps on the market, and make the configuration process as straightforward as possible so as not to burden or confuse the user.
What we learned
We learned that users are hungry for multiuser AR/VR experiences where they can interact with each other and with the same objects. And lots of vector math.
What's next for BlastARs
We hope to optimize the experience so that user tracking is even more accurate. We also hope to release our tracking system as an open-source library so that other developers can boost the number of multiuser AR experiences on mobile.