Inspiration
You know those "Rage Rooms", where you can pay to break bottles and other objects in a room? I built that experience as a Meta Quest app! Throw bottles, teacups, plates and more from the comfort of your own room!
What it does
Play in AR or VR mode, and throw objects that collide with the environment and break! In the VR training range, you can practice your aim and experiment with the different types of throwable objects. Switch to AR, throw objects at your real physical environment, and watch them break!
How I built it
Building on the Meta Quest APIs, I tried to push the limits of what is possible with the technology. From dynamic hand poses while grabbing objects to using the full capabilities of the Scene API, I wanted to combine the technical abilities of AR/VR into an app that hasn't been made before.
Challenges I ran into
I spent a LONG time ironing out the little details. I wanted the unlimited object spawns on your wrist to feel satisfying. I wanted the gameplay to be intuitive. I wanted the hand menu to pop up only when you turn your palm toward your face. Getting there took a lot of late nights tweaking small values. Dealing with the hand quaternions, and figuring out how to appropriately read, change, and modify those values for all of these features, definitely consumed a lot of my time. Did you know that to properly read or set the VR hand quaternions, you have to multiply by the inverse of the headset quaternion? I didn't. But I know now.
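To illustrate the quaternion trick above, here's a minimal Python sketch (the actual app is a Unity project, so this is only the math, not the real API). It assumes quaternions are (w, x, y, z) tuples; the function names and the 0.8 threshold for the palm-facing check are my own illustrative choices, not values from the project.

```python
# Quaternions as (w, x, y, z) tuples. The Hamilton product and conjugate
# are enough to express "hand rotation relative to the headset":
#   q_local = inverse(q_head) * q_hand
# (for unit quaternions, the inverse is just the conjugate).

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def q_conj(q):
    """Conjugate (= inverse for unit quaternions)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def head_relative(q_head, q_hand):
    """Hand rotation expressed in the headset's frame."""
    return q_mul(q_conj(q_head), q_hand)

def palm_faces_head(palm_normal, palm_to_head, threshold=0.8):
    """True when the palm normal points toward the headset, i.e. the
    cosine of the angle between the two vectors exceeds `threshold`."""
    def norm(v):
        m = sum(c*c for c in v) ** 0.5
        return tuple(c / m for c in v)
    n, d = norm(palm_normal), norm(palm_to_head)
    return sum(a*b for a, b in zip(n, d)) >= threshold
```

With this head-relative rotation, a wrist menu can check the same pose regardless of which way the player's body is turned, which is what makes the gesture feel consistent.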
Accomplishments that I'm proud of
I'm very proud of my target-spawning algorithm that runs during the minigame mode. Over 60 seconds, targets spawn on the walls around you and shrink over time. I'm particularly proud of my implementation of this in AR. I go into this in my video, but I spent a LONG time on this algorithm. When picking which wall a target will spawn on, it makes a weighted random pick, with each wall's probability proportional to its size, and then checks whether anything is in front of the potential spawn point. The end result is that targets appear where the player expects them to, leading to an intuitive player experience.
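The wall-selection step described above can be sketched like this. This is a hedged illustration, not the project's actual code: walls are plain dicts with an `"area"` field, and `is_blocked` stands in for the in-game raycast that checks whether something sits in front of the candidate spawn point.

```python
import random

def pick_spawn_wall(walls, is_blocked, rng=random):
    """Pick a wall with probability proportional to its area,
    skipping walls whose candidate spawn point is obstructed.

    walls: list of dicts, each with an "area" key (e.g. from the Scene
           API's room layout). is_blocked(wall) -> bool is a stand-in
           for the raycast/occlusion check. Returns None if every wall
           is blocked.
    """
    candidates = [w for w in walls if not is_blocked(w)]
    if not candidates:
        return None
    total = sum(w["area"] for w in candidates)
    r = rng.uniform(0, total)
    for w in candidates:
        r -= w["area"]
        if r <= 0:
            return w
    return candidates[-1]  # guard against floating-point leftovers
```

Weighting by area means a long wall gets proportionally more targets than a narrow one, so the spawn distribution matches the player's intuition about the room.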
What I learned
This is actually my first app for the Meta Quest. I've used AR Foundation in Unity before to make AR apps for iOS, but the Meta Quest APIs are different. I had a really fun time learning how the Meta Quest APIs work and implementing them in my project. I relied heavily on the Scene API in particular, and I'm really happy with how it turned out.
What's next for Break Room
I want to submit Break Room to the Meta Quest Store! (And I'm currently working on this!) I have some features I want to add next: improving raw hand tracking, and adding a multiplayer scoreboard so you can compete with your friends.

