Inspiration

The inspiration for the IP came from the mech-related franchises that have been built over the years. I was a MechWarrior player back in the day through its various releases and spin-offs, and I've always enjoyed RTS games with mech components as well. The functional inspiration came from ongoing research in XR HCI through various prototypes I've built with different teams over the years for applications in manufacturing, energy, defense, and gaming. I saw this competition as an opportunity to revisit concepts I learned while developing personal projects like MageWorks and Blast Point. Other VR games I've enjoyed with intuitive HCI include Half-Life: Alyx, Fuji, Space Pirate Trainer, and Until You Fall, among many others.

What it does

Drone Core Command is a game where players have an inventory of different 'Drone Cores' they can equip to modify a gun attached to their arm that deals direct damage, or to control their pet drone to do the same. Under the hood, the game's UX looks for clever ways around some of the constraints of gesture recognition to ease the load on the actual recognition while still letting players react naturally to the environment around them. The goal of the game is two-fold: command a drone out on the battlefield, then repair any damage it received while in battle. For this iteration of the game, damage is repaired through simple puzzles, each with a unique mechanic/mini-game.

How I built it

I built the game using Unreal Engine, specifically the VR template. The initial concept of picking up and replacing cores was tested for viability. Once this HCI was worked out, I pushed it into a core-loop setting to see what it felt like to battle with a pet drone from a fixed vantage point, then work in the hub to repair the drone or customize a loadout. Map selection was also a feature I was working on, but to cut scope creep I put it on the back burner for now to focus on the UX of the core loop.

Challenges I ran into

The main challenges were gesture recognition, simplifying player input, and designing the game to be fast and intuitive. I found that gesture recognition wasn't always reliable: depending on hand position and angle, the sensor can struggle to interpret a pose because of line-of-sight occlusion, so coming up with clever workarounds was an important part of the interaction UX. Limiting how long a player needed to keep their hands raised was also part of the challenge, given the physical fatigue that comes with it; interactions had to be quick and easy. Aiming and shooting the gun was another challenge: when a player closed their hand into a fist to activate the gun, aiming wasn't smooth enough to maintain as desired, even after various lerping tests. Using the player's visual line of sight in conjunction with hand gestures, however, proved to be an accurate way to target enemies from afar.

The level design was also very specific to this type of gameplay. The player needed to be above the landscape's field of play for better targeting with direct traces versus ballistic traces, and the drone's AI needed engagement ranges far enough apart that both the drone and the player kept a clear line of sight to their respective targets.

Accomplishments that I'm proud of

Building this in a month wasn't easy, but I'm proud of what I accomplished in this timeframe. When I wrapped up the latest build, I really wanted to start building more levels, experiment with different weapon types, and continue to polish the drone's AI. I was also really impressed with how far the dev tools from Epic and Meta have come, making something like this viable within a month's timeframe.

What I learned

I was reminded fairly often of the constraints of XR, and that designing for this medium really does require a unique perspective on game dev. Traditional interactions rarely applied, and finding new ways to prioritize spatialized gesture recognition over traditional input was humbling. Keeping things 'gamified' and as accessible as possible, with consideration for all types of players and gamer preferences, is a constant challenge when developing in XR.

What's next for Drone Core Command

A handful of thoughts and ideas came up during development that I wanted to explore, but I had to restrain myself so I could finish a demo with a clear core loop. One idea was a progression system where player repairs would earn points to advance their pet drone along a skill tree, giving players much more agency. Needed repairs were also supposed to include debuffs for the mech, but this didn't make it into the demo. The map room will be useful once I can build multiple battlegrounds for players to select from, each with unique enemies and ground hazards for the pet drone to avoid. And last but not least, I want to use a portion of the hub to remotely pilot the drone that drops the airstrike, but for perk-retrieval missions in another mini-game-type environment. More specifically, if players could use their hands to steer a virtual steering wheel, they could pilot the drone through 3D 'top-down shooter' style environments from within their hub, collecting perks that would benefit them in combat.

Built With

  • unrealengine

Updates


This past week or so of iteration focused on the reconnaissance room, the map room, and some puzzle-repair logic.

On the reconnaissance side of things, the mini-game has evolved to include ship selection and mission selection. The functionality is working, and soon each mission type will award the player a perk for main gameplay. In addition, the ship selected in the recon room will also be the ship of choice when using an airstrike drone core.

In the map room, level-progression logic has been built so players can choose a territory, and then a specific tile within that territory, when they deploy. The goal here is part of the main gameplay loop: control all of the territories and thereby unlock subsequent levels with more complexity and challenges.

Last but not least, I also had some fun iterating on the micro-parasite puzzle. I'm starting to build abstract sub-assemblies where puzzle logic, previously placed on basic spheres, will be adapted to more complex assemblies.

#dronecorecommand #gamedev #vr #metaquest3
