Inspiration
We're University of Washington students completing a project for a class capstone (CSE 481V).
For the class we're required to develop a VR application on Quest 2 headsets, so we figured we might as well enter whatever we built in this hackathon as well.
COVID locked down people's lives and kept us apart. Once lockdown ended, all that pent-up energy needed somewhere to go.
The ancient art of mini-golf is a wonderful way to connect with friends and get some exercise, but getting to a real mini-golf course is difficult and costly for many.
Do you smell that? That's the smell of a Mixed Reality opportunity!
Enter GAWLF - Golf Augmented With mixed reaLity and Friends (and generative AI). A free, accessible, mixed-reality multiplayer mini-golf experience! (For the low, low investment of $200 for a Quest 2 headset /s).
What it does
GAWLF lets multiple users in the same physical room play a virtual game of mini-golf together.
Key features:
- Grabbable Golf Physics. Grab a golf club and HIT. SOME. BALLS.
- Mixed Reality. Play together with friends in your real room. The balls interact with objects in the room as well as the floor and walls.
- Shared Spatial Anchor Multiplayer. Don't just hit balls alone, do it with friends.
- AI Commentator. Your game is commentated by a live GPT-based announcer who reacts to in-game events.
How we built it
Our app was built in Unity. Key plugins/extensions we used were:
- The Meta Quest development plugin
- Photon Unity Networking
- Photon is a networking library + server hosting service that lets us get users together in lobbies and share important game information (spatial anchor IDs, gameplay events).
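The lobby flow Photon gives us can be sketched in a few lines. This is a minimal, illustrative version using PUN 2's callback API; the class name and room settings here are examples, not our actual values.

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Minimal sketch: connect to Photon and drop everyone into one shared room.
public class GawlfLobby : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Uses the App ID configured in the PhotonServerSettings asset.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // Join (or create) a room; everyone in the same room shares game state.
        PhotonNetwork.JoinOrCreateRoom("gawlf-room",
            new RoomOptions { MaxPlayers = 4 }, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log($"Joined with {PhotonNetwork.CurrentRoom.PlayerCount} player(s)");
    }
}
```

Once players share a room, Photon handles routing gameplay messages (like spatial anchor IDs) between them.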
Key features from the Meta Quest platform that we use in our app are:
- Passthrough
- Passthrough lets users see where they and their friends are in the room when they play. Passthrough really sells the mixed reality experience (making virtual objects feel anchored in real space) and also greatly reduces motion sickness when playing the game.
- Shared Spatial Anchors
- Shared Spatial Anchors let us align the world origin across all players' headsets, so that everyone sees virtual objects in consistent locations in the room.
- The Interaction SDK
- We use the Interaction SDK for grabbing and manipulating game objects as well as UI elements in the scene.
- The Voice SDK
- We use the Voice SDK's text-to-speech to generate the announcer's spoken lines.
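The shared-anchor alignment idea above can be sketched as follows: once each headset has localized the same shared spatial anchor, parenting all networked content under it makes every player see the course in the same real-world spot. This is a simplified illustration (field names like `sharedContentRoot` are ours), not our full alignment code.

```csharp
using UnityEngine;

// Sketch: align the shared play space to a localized OVRSpatialAnchor.
public class AnchorAligner : MonoBehaviour
{
    [SerializeField] private OVRSpatialAnchor sharedAnchor;
    [SerializeField] private Transform sharedContentRoot;

    void Update()
    {
        // Wait until the anchor has been created/localized on this headset.
        if (sharedAnchor == null || !sharedAnchor.Created) return;

        // All networked course geometry lives under the anchor, so its
        // pose in the room is identical in every player's view.
        sharedContentRoot.SetParent(sharedAnchor.transform, worldPositionStays: false);
        enabled = false; // align once
    }
}
```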
Challenges we ran into
Physics
We initially assumed the golf physics would be the most trivial aspect of the game, since physics is such a core part of what Unity provides. Surprisingly, it ended up being one of the most time-consuming parts of the project.
A key issue was that the Meta-provided Grabbable components updated the club's position every timestep by directly writing the targeted object's Transform. This interacts poorly with Unity's physics system. After a few attempted solutions, we determined the fix was to write a custom component implementing ITransformer that instead moves the object through Rigidbody.MovePosition/MoveRotation.
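The fix can be sketched roughly like this, assuming the Interaction SDK's `ITransformer` interface (`Initialize`/`BeginTransform`/`UpdateTransform`/`EndTransform`) and a single grab point; our real component also has to handle grab offsets and two-handed grabs.

```csharp
using Oculus.Interaction;
using UnityEngine;

// Sketch: drive the grabbed club through its Rigidbody instead of writing
// the Transform directly, so the physics engine can still resolve
// collisions between the club head and the ball.
public class PhysicsGrabTransformer : MonoBehaviour, ITransformer
{
    private IGrabbable _grabbable;
    private Rigidbody _rigidbody;

    public void Initialize(IGrabbable grabbable)
    {
        _grabbable = grabbable;
        _rigidbody = _grabbable.Transform.GetComponent<Rigidbody>();
    }

    public void BeginTransform() { }

    public void UpdateTransform()
    {
        // Target pose of the (single) grab point.
        Pose target = _grabbable.GrabPoints[0];

        // MovePosition/MoveRotation let the physics engine interpolate the
        // motion and generate proper collision responses, unlike setting
        // transform.position, which teleports the collider each frame.
        _rigidbody.MovePosition(target.position);
        _rigidbody.MoveRotation(target.rotation);
    }

    public void EndTransform() { }
}
```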
Big shoutout at this point to the Unity forums, and in particular Edy: https://forum.unity.com/threads/move-dynamic-non-kinematic-rigidbody-by-setting-transform-position.1361386/#post-8587141
Networking
Overall, Photon's documentation was excellent, but networked features still took a decent amount of time to develop. Understanding ownership and how to properly send RPCs for instantiating and deleting objects was a fun challenge.
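The ownership-plus-RPC pattern can be sketched like so, using PUN 2's `PhotonNetwork.Instantiate` and `[PunRPC]`; the class, prefab, and method names here are illustrative, not our actual ones.

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch: the caller instantiates objects network-wide and becomes their
// owner; one-off gameplay events are broadcast to all clients as RPCs.
public class BallSpawner : MonoBehaviourPun
{
    public void SpawnBall(Vector3 position)
    {
        // Creates the ball on every client in the room; the caller owns it,
        // so the caller's simulation is authoritative for its movement.
        PhotonNetwork.Instantiate("GolfBall", position, Quaternion.identity);
    }

    public void AnnounceHoleComplete(int strokes)
    {
        // RPCs broadcast gameplay events (e.g. for the AI commentator).
        photonView.RPC(nameof(OnHoleComplete), RpcTarget.All, strokes);
    }

    [PunRPC]
    private void OnHoleComplete(int strokes)
    {
        Debug.Log($"Hole finished in {strokes} strokes");
    }
}
```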
Shared Spatial Anchors
Shared spatial anchors were the bane of our existence. Even just successfully running the provided SharedSpatialAnchors demo took a lot of trial and error. Once again, we want to shout out a forum hero, Shavier: LINK. Enabling shared spatial anchors required multiple bits of red tape that we felt were not properly documented in the docs for the SSA sample: LINK.
Accomplishments that we're proud of
We're very proud to have been able to create an MVP which combines all of the elements that we set out to include:
- Golf physics
- Multiplayer
- Mixed reality
- AI commentator
The final product has a chaotic vibe where you just get to have a good laugh with your friends as you grab golf clubs out of their hands and slam their balls off the course.
What we learned
Things we learned:
- Managing version control with Unity. We developed ad hoc practices, like creating a separate scene for each git branch to avoid complex merge conflicts.
- Photon Networking. Photon's API was a super interesting thing to learn. We really enjoyed learning the abstractions Photon provides and the feeling of mastering networking to enable the features we envisioned.
- Unity's Physics System.
What's next for GAWLF - Golf Augmented With mixed reaLity and Friends
Well, the course isn't over for us, so we've kept working on the game since.
Major features we've been focusing on are:
- Procedural golf course generation.
- Improved golf physics feel.
- Ball traces (line renderer for seeing the ball's path).
- More responsive AI commentator (currently when overloaded with events the Text to Speech will fail to produce audio).
- Audio feedback (audio feedback for things like hitting the ball or completing a course).
- Powerups.
- Verticality (allowing courses to build on top of tables in the room).
Try it out
Download the APK here! (LINK)
Unfortunately, using the multiplayer features requires being added to the release channel through the Meta developer hub. Feel free to comment below with the email tied to your Meta Quest account if you'd like access!
Built With
- meta-presence
- passthrough
- photon
- quest
- scene
- sharedspatialanchors
- unity