Inspiration
Inspired by Geometry Dash, Five Nights at Freddy's, and Bloons Tower Defense.

What It Does
The game scans the VR player's room and sends the spatial data to a Supabase database. A web app hosted on AWS Amplify recreates that room for the tablet player in real time. This shared setup lets the tablet player spawn cute-but-creepy creatures around the room, sending them in to attack the core while the VR player defends it using pitch-based attacks.
How we built it
We engineered an asymmetric multiplayer experience by bridging a Unity-based VR application with a React web client via Supabase Realtime WebSockets. On the VR side, we leveraged the Meta MR Utility Kit (MRUK) to generate spatial meshes of the player's physical room; this boundary data is serialized and pushed to our cloud database (sketched below). The web client, hosted on AWS Amplify, fetches the spatial data to render an accurate, interactive 2D map of the VR player's exact physical environment. After establishing a presence-based handshake between the headset and the tablet, web players can use touch input to spawn units outside the physical room boundaries, triggering real-time database inserts that instantiate 3D enemies in the VR space. For the VR defender's "One Input" mechanic, we implemented a custom Fast Fourier Transform (FFT) to analyze microphone data, allowing the player to combat incoming swarms purely by modulating their vocal pitch (see the pitch-detection sketch below).
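Here is a minimal sketch of the room-upload path, assuming the floor outline has already been pulled from the MRUK scan as a list of world-space points (the exact accessor differs across MRUK versions). The `rooms` table name, payload fields, and `RoomUploader` class are placeholders rather than our exact schema; the request itself is a standard PostgREST insert against Supabase's REST endpoint.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

public class RoomUploader : MonoBehaviour
{
    // Placeholder project URL / table; real values live in the Supabase dashboard.
    const string SupabaseUrl = "https://YOUR_PROJECT.supabase.co/rest/v1/rooms";
    const string AnonKey = "YOUR_ANON_KEY";

    [Serializable]
    class RoomPayload
    {
        public string room_id;
        public List<Vector2> boundary;   // floor outline in metres, height dropped
    }

    // `floorPoints` would come from the MRUK room scan: the world-space
    // corners of the floor polygon.
    public void Upload(string roomId, IReadOnlyList<Vector3> floorPoints)
    {
        var payload = new RoomPayload { room_id = roomId, boundary = new List<Vector2>() };
        foreach (var p in floorPoints)
            payload.boundary.Add(new Vector2(p.x, p.z));   // flatten onto the XZ plane
        StartCoroutine(Post(JsonUtility.ToJson(payload)));
    }

    IEnumerator Post(string json)
    {
        var req = new UnityWebRequest(SupabaseUrl, "POST");
        req.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(json));
        req.downloadHandler = new DownloadHandlerBuffer();
        req.SetRequestHeader("Content-Type", "application/json");
        // PostgREST expects the anon key both as `apikey` and as a bearer token.
        req.SetRequestHeader("apikey", AnonKey);
        req.SetRequestHeader("Authorization", "Bearer " + AnonKey);
        yield return req.SendWebRequest();
        if (req.result != UnityWebRequest.Result.Success)
            Debug.LogError("Room upload failed: " + req.error);
    }
}
```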
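And a compact sketch of the pitch-detection loop: an iterative radix-2 Cooley-Tukey FFT over a Hann-windowed microphone buffer, taking the loudest bin in the vocal range as the current pitch. The window size, vocal-range bounds, and `PitchDetector` class are illustrative choices; the actual game layers an adaptively calibrated baseline on top of this (see Challenges).

```csharp
using UnityEngine;

public class PitchDetector : MonoBehaviour
{
    const int SampleRate = 44100;
    const int WindowSize = 2048;                 // power of two for radix-2 FFT
    AudioClip micClip;
    readonly float[] re = new float[WindowSize];
    readonly float[] im = new float[WindowSize];

    void Start()
    {
        // Looping one-second capture buffer from the default microphone.
        micClip = Microphone.Start(null, true, 1, SampleRate);
    }

    void Update()
    {
        int pos = Microphone.GetPosition(null);
        if (pos < WindowSize) return;            // wait for a full window
        micClip.GetData(re, pos - WindowSize);   // most recent samples
        System.Array.Clear(im, 0, WindowSize);
        for (int i = 0; i < WindowSize; i++)     // Hann window against spectral leakage
            re[i] *= 0.5f * (1f - Mathf.Cos(2f * Mathf.PI * i / (WindowSize - 1)));
        Fft(re, im);

        // Strongest bin inside a rough vocal range (~80-1000 Hz) -> pitch estimate.
        int lo = 80 * WindowSize / SampleRate, hi = 1000 * WindowSize / SampleRate;
        int peak = lo; float best = 0f;
        for (int k = lo; k <= hi; k++)
        {
            float mag = re[k] * re[k] + im[k] * im[k];
            if (mag > best) { best = mag; peak = k; }
        }
        float pitchHz = peak * (float)SampleRate / WindowSize;
        Debug.Log($"Pitch: {pitchHz:F0} Hz");    // pitchHz drives the attack mechanic
    }

    // In-place iterative radix-2 Cooley-Tukey FFT over real/imaginary arrays.
    static void Fft(float[] re, float[] im)
    {
        int n = re.Length;
        for (int i = 1, j = 0; i < n; i++)       // bit-reversal permutation
        {
            int bit = n >> 1;
            for (; (j & bit) != 0; bit >>= 1) j ^= bit;
            j ^= bit;
            if (i < j)
            {
                float t = re[i]; re[i] = re[j]; re[j] = t;
                t = im[i]; im[i] = im[j]; im[j] = t;
            }
        }
        for (int len = 2; len <= n; len <<= 1)   // butterfly passes
        {
            float ang = -2f * Mathf.PI / len;
            float wRe = Mathf.Cos(ang), wIm = Mathf.Sin(ang);
            for (int i = 0; i < n; i += len)
            {
                float cRe = 1f, cIm = 0f;
                for (int k = 0; k < len / 2; k++)
                {
                    int a = i + k, b = i + k + len / 2;
                    float vRe = re[b] * cRe - im[b] * cIm;
                    float vIm = re[b] * cIm + im[b] * cRe;
                    re[b] = re[a] - vRe; im[b] = im[a] - vIm;
                    re[a] += vRe; im[a] += vIm;
                    float nRe = cRe * wRe - cIm * wIm;
                    cIm = cRe * wIm + cIm * wRe; cRe = nRe;
                }
            }
        }
    }
}
```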
Challenges we ran into
Architecting seamless real-time synchronization between a standalone VR headset and a browser-based web app posed significant hurdles, particularly around WebSocket latency and the optimistic UI rendering needed to keep the tablet feeling snappy. Translating complex 3D spatial anchors and polygonal boundary data from the Meta XR SDK into a usable, scaled 2D format for the tablet interface also required meticulous mathematical conversion (one version of that step is sketched below). Finally, tuning the asymmetric gameplay was highly iterative: we had to carefully calibrate an adaptive audio baseline for the VR player's pitch while ensuring the tablet player's spawn economy felt impactful but balanced.
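To make that conversion concrete, here is a minimal sketch of the flatten-and-scale step, assuming the boundary arrives as world-space floor points. The uniform-scale normalization into [0,1] map coordinates is one reasonable choice (it keeps the tablet map from stretching the room), not necessarily the exact math we shipped.

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class RoomProjection
{
    // Flattens world-space boundary points (metres) onto the XZ plane and
    // normalises them into [0,1] map coordinates with a single uniform scale,
    // so the room's aspect ratio is preserved on the 2D tablet map.
    public static List<Vector2> ToMapSpace(IReadOnlyList<Vector3> boundary)
    {
        float minX = float.MaxValue, minZ = float.MaxValue;
        float maxX = float.MinValue, maxZ = float.MinValue;
        foreach (var p in boundary)
        {
            minX = Mathf.Min(minX, p.x); maxX = Mathf.Max(maxX, p.x);
            minZ = Mathf.Min(minZ, p.z); maxZ = Mathf.Max(maxZ, p.z);
        }
        // The longer room axis maps to the full [0,1] span; guard degenerate rooms.
        float extent = Mathf.Max(maxX - minX, maxZ - minZ);
        float scale = extent > 0f ? 1f / extent : 1f;
        var result = new List<Vector2>(boundary.Count);
        foreach (var p in boundary)
            result.Add(new Vector2((p.x - minX) * scale, (p.z - minZ) * scale));
        return result;
    }
}
```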
Accomplishments that we're proud of
We are incredibly proud to have bridged two fundamentally different platforms, a spatial VR environment and a 2D web application, into a cohesive, low-latency multiplayer loop. Transforming a player's real-world living room into a dynamic, synchronized battleground is a major technical win. Additionally, engineering a purely pitch-based combat system gives the game a unique, accessible twist on the "One Input" theme that distinctly elevates its identity.
What we learned
This project heavily expanded our expertise in full-stack game development, specifically in using Supabase Realtime channels to connect distinct development environments (Unity and React). We gained deep, practical experience with spatial data, learning how to serialize and manipulate mixed reality boundary points to bridge physical and digital spaces. On the design front, we learned invaluable lessons about asymmetric balancing, particularly how to establish fair risk-and-reward loops when players have completely different interfaces and objectives.
What’s next for PITCH V POKE
Next, we want to improve performance and reduce lag between devices.
We also plan to add more enemy types, each with different behaviors and sound interactions.
We want to improve the visuals of the scanned room so it looks cleaner and more accurate.
Finally, we plan to polish the gameplay loop and make it more fun and replayable, possibly adding progression, upgrades, or different game modes.
Built With
- amazon-web-services
- amplify
- c#
- react
- supabase
- unity
- vite