
I had been working on this engine and game for a year in small pockets of free time. Originally it was a single-player, stationary game where enemies spawned on the other side of a portal in mixed reality. For this hackathon I added multiplayer support.

Inspirations

Probably a very common answer, but Ready Player One. I want to live in the Oasis, be able to travel from planet to planet, go anywhere and do anything with anyone. That is my ultimate goal.

What it does

I'm building a mixed reality wave shooter game. Instead of guns, you use spells, cast by drawing a series of runes in the air with your fingers via hand tracking. The rune drawing relies on a grid pattern, NOT AI, for reliability and performance's sake.
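For flavour, a minimal sketch of what grid-based rune matching could look like: snap each fingertip sample to the nearest grid cell and compare the visited-cell sequence against a known rune. Every name and size here (`GridRune`, the 3x3 grid, 8 cm cells) is my own illustration, not the game's actual code.

```cpp
#include <algorithm>
#include <vector>

// Illustrative grid-based rune recognition: no ML, just cell snapping and
// an exact sequence compare. All constants are assumptions for the sketch.
struct GridRune {
    static constexpr int kGridSize = 3;        // assumed 3x3 casting grid
    static constexpr float kCellSize = 0.08f;  // assumed 8 cm cells

    // Snap a 2D fingertip position (in the plane of the casting grid)
    // to a cell index in 0..8.
    static int SnapToCell(float x, float y) {
        int cx = std::clamp(static_cast<int>(x / kCellSize), 0, kGridSize - 1);
        int cy = std::clamp(static_cast<int>(y / kCellSize), 0, kGridSize - 1);
        return cy * kGridSize + cx;
    }

    // A rune is just an ordered list of cell indices; matching is an exact
    // sequence compare, which is what makes it cheap and deterministic.
    static bool Matches(const std::vector<int>& drawn,
                        const std::vector<int>& rune) {
        return drawn == rune;
    }
};
```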

How I built it

I'm a solo developer, and this is a side project I work on for two hours a day before work. The game runs in a custom (WIP) voxel engine built from the ground up in native code with Vulkan and OpenXR. Starting from little more than a Vulkan renderer, I built the engine up around it. I originally implemented finger tracking as just a stepping stone to controllers, since it was quicker to implement, but the longer I delayed adding controller input, the more attached I became to finger tracking. I started to see it not as a gimmick, but as a viable way to interact with games. I noticed that most finger-tracked games had locomotion methods that broke immersion for me and felt clunky, like a hindrance. I decided I wanted to focus my engine solely on finger tracking and innovate to open a path for finger-tracked games to be deeper and richer. While there were some cool tech demos, it seemed nobody had really tried to make anything of substance with finger tracking.

Challenges I ran into

Moving from Unity to native, getting my head around Vulkan was very slow and painful. OpenXR's extension system was confusing at first, and the lack of good documentation made it worse. Having mostly built the game before the hackathon and submitted it as an update, retrofitting multiplayer into it was a lot of hard work. With more time I would properly implement client-side prediction, rollback, and host/server authority. Right at the end my frame rate started tanking without my having changed anything: the game was suddenly being thermally throttled when it hadn't been before. I suspect Meta pushed a change to the settings in this regard, so I had to cut rendering costs by downgrading the graphics a bit.

Accomplishments that I'm proud of

Fluid, responsive and accurate game controls with hand tracking only, opening a path to a richer hand-tracking-only game. I'm also proud of being able to retrofit multiplayer into a game that wasn't built for it in such a small amount of time.

What I learned

I already sort of knew this, but it's very tricky to add multiplayer to something built as single player.

What's next for Runes: Unbound

I wanted to add a hands-free, innovative locomotion technique I have prototyped into this game, but I ran out of time for this competition's build. After that come procedurally generated planet-scale open worlds with modifiable terrain. I have an LOD system already in place, and the architecture already supports planet-scale worlds, even in multiplayer. Not only do I have a streaming system that loads in new parts of the world at runtime, it can even stream in OTHER worlds. The end goal is to stream in a 'space' world if you go up high enough, and have seamless planet-to-space-to-other-planets traversal.


Updates

posted an update

Had a bit of a disaster! Suddenly my game ran at 45fps with no changes. I spent a few hours trying to fix it and couldn't work it out. Then I checked the other metrics and saw my battery was in the red; it was thermally throttling me. It was fine if I turned off the environment, but you have to have one of those.

Luckily, I built in a level-of-detail system ages ago, so I just switched to the next level of detail, where the voxels are twice their normal size, and it worked great: 90fps. The environment got a bit crappy, but that's better than unplayable fps. I'm going to have to put in something that automatically detects throttling and lowers the environment resolution by itself. I hadn't touched the LOD system in god knows how long; luckily it still works fine.
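The auto-detect idea could be as simple as watching a rolling average of frame time and stepping the environment down one LOD level after a sustained run of missed frames. A hypothetical sketch, not the engine's actual code (the class name, thresholds, and smoothing factor are all assumptions):

```cpp
// Hypothetical throttle detector: if the rolling average frame time stays
// over budget for about a second, drop the environment one LOD level.
class AutoLod {
public:
    explicit AutoLod(float targetFps = 90.0f)
        : budgetMs_(1000.0f / targetFps) {}

    // Call once per frame with the frame time in milliseconds; returns the
    // LOD level the environment should use (0 = full detail, higher = coarser).
    int Update(float frameMs) {
        avgMs_ = 0.95f * avgMs_ + 0.05f * frameMs;  // cheap rolling average
        if (avgMs_ > budgetMs_ * 1.2f) {            // 20% over budget counts as "slow"
            if (++slowFrames_ > 90) {               // ~1s of sustained misses at 90fps
                ++lod_;
                slowFrames_ = 0;
            }
        } else {
            slowFrames_ = 0;                        // any good stretch resets the count
        }
        return lod_;
    }

private:
    float budgetMs_;
    float avgMs_ = 0.0f;
    int slowFrames_ = 0;
    int lod_ = 0;
};
```

Stepping back up after the device cools down would need its own hysteresis so the game doesn't oscillate between levels.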

posted an update

Things are going well: the gameplay is synced and working, the UI is being refined, and I think everything is on schedule.

After the bulk of multiplayer was done I started working on some usability features that were lacking, for example a reference for the spells so that you can look them up in game. I also added a proper game-over screen instead of harshly cutting back to the menu, and made a health indicator, since there wasn't one previously, which had to be network-synced too. Then I had to alter the game flow a little to accommodate an extra player, for example adding a downed state for when one player is hurt too badly.

So at this point, the game works all the way through in multiplayer. Maybe a little rough around the edges, but working. :)

posted an update

I finally finished the compression and decompression of the finger joints for network sending and receiving! This is a huge step, as it should now unlock multiplayer gameplay, since all the interactions are finger-tracking based. Now I should be able to cast spells across the network :D

posted an update

Completed my network connection test. Now I have a peer-to-peer connection established.

Next up is to start sending data packets, which is a whole mess in itself. I have to architect how and when to pack data into packets and unpack it again, work out how to route it to the correct entity, and add compression and all that jazz.

Hand tracking is a lot of data to send. Finger positions and rotations, 3 floats for position and 4 for rotation, for every tracked joint is a hell of a lot. Fortunately I've done this before and have a good plan in my head for dramatically shrinking the size of the data I need to send.

One thing I do is make all finger joint positions relative to their parent joint. Unless you have freakish hands, the distance from a joint to its parent can't be more than, say, 15cm, so the floats I pack don't need a full float's worth of precision. I do the same for the hands relative to the head. And for angles, I clamp them within a range so I don't need to cover the full range.
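The parent-relative trick can be sketched in a few lines: subtract the parent position, then clamp each component to the assumed bound so it's guaranteed to fit the quantisation range. The names and the ±0.15 m bound are illustrative, not the engine's actual code.

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// Express a joint relative to its parent joint. A child joint can only be
// a few centimetres from its parent, so after the subtraction each
// component fits in a small, clampable range (here assumed to be +/-15cm).
Vec3 RelativeToParent(const Vec3& joint, const Vec3& parent) {
    auto clamp15cm = [](float v) { return std::clamp(v, -0.15f, 0.15f); };
    return { clamp15cm(joint.x - parent.x),
             clamp15cm(joint.y - parent.y),
             clamp15cm(joint.z - parent.z) };
}
```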

Once I have a min and max range for a float, I determine the precision I need. I divide the difference between min and max by the precision to get the number of steps I need to support. Then I round to the closest step, convert to an int, and send that in my data packet.

For example, if my range is 2.0 to 8.0 and I need a precision of 0.25:

(8.0 - 2.0) / 0.25 = 24 steps

That gives an int range of 0 to 24, which I can compress into 5 bits instead of sending a 32-bit float.

So, if my value is 3.78:

3.78 - min(2.0) = 1.78
1.78 / 0.25 = 7.12
Round to int: 7

Then on the receiving end I do:

7 * 0.25 = 1.75
1.75 + min(2.0) = 3.75

So I lose precision, from 3.78 down to 3.75, but the precision loss is tunable.
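The whole scheme above fits in a tiny encode/decode pair. A sketch of the idea (the `Quantizer` name is mine; bit-packing the resulting index is left out for brevity):

```cpp
#include <cmath>

// Range-based quantiser matching the scheme described above: map a float in
// [min, max] to an integer step index, send the index, and reconstruct the
// float on the receiving end.
struct Quantizer {
    float min, max, precision;

    // Number of representable steps; ceil(log2(Steps()+1)) bits are needed.
    int Steps() const {
        return static_cast<int>(std::round((max - min) / precision));
    }
    // Sender side: float -> step index.
    int Encode(float value) const {
        return static_cast<int>(std::round((value - min) / precision));
    }
    // Receiver side: step index -> float (lossy, but the loss is tunable).
    float Decode(int step) const {
        return min + step * precision;
    }
};
```

With the worked example: `Quantizer{2.0f, 8.0f, 0.25f}` gives 24 steps, encodes 3.78 to step 7, and decodes step 7 back to 3.75.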

On another note, to make things deterministic, the client sending the data also rounds its own copies of the variables this way before using them in other calculations. That way I know both clients are making the same calculations from the same starting points, without the fuzziness you normally get from floats.

This is also useful when checking client-side prediction: you can check the bits are the same. I have done this in previous Unity projects, but for the scope of this jam and my custom engine I won't have client-side prediction/rollback or server authority. That's the plan for the engine later on.

Not sure if this is the typical networking approach; I just kinda came up with it myself years ago.

posted an update

Started retrofitting multiplayer, beginning with creating a second Player object. At the moment I just place it next to the original player and see what happens.

- The camera gets parented to the player inside the player's instantiation code. Changed this so the camera wouldn't get incorrectly parented to remote players when they join.
- For testing, the remote player just copies the host player's finger joints. There was a race condition that caused the remote player's hand graphics to be improperly instantiated; fixed.
- Remote players didn't get spells added; now they do.
- The spell-casting grid faces the camera, which is incorrect for remote players. Added a head reference and used that as the facing target for the spell-casting grid. Also had to create a head graphic for remote players. Now the remote player can cast spells.
- Enemies didn't cache the player; they simply called GetPlayer on my manager class every time they needed the reference in behaviour. I added a method to acquire and cache a target player, so that the game can support multiple players. For now it simply chooses the closest player. Had to go through and change all the hard-coded references to use the cached target. Finished 2/5 enemies; next I need to finish the others.
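The closest-player targeting change could look something like this. A hypothetical sketch, not the engine's actual code (the `Enemy`/`Player` shapes and `AcquireTarget` name are mine):

```cpp
#include <limits>
#include <vector>

struct Player { float x, y, z; };

// Instead of asking a manager for *the* player every frame, each enemy
// acquires the closest of N players once and caches the reference.
struct Enemy {
    float x, y, z;
    const Player* target = nullptr;  // cached target, used by behaviour code

    void AcquireTarget(const std::vector<Player>& players) {
        float best = std::numeric_limits<float>::max();
        for (const auto& p : players) {
            float dx = p.x - x, dy = p.y - y, dz = p.z - z;
            float d2 = dx * dx + dy * dy + dz * dz;  // squared distance suffices
            if (d2 < best) {
                best = d2;
                target = &p;
            }
        }
    }
};
```

When to re-acquire (on target death, on a timer, or when a closer player appears) is a design choice this sketch leaves open.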
