With the Global Archiact Jam having no theme, I wanted to create a Google Cardboard based Android VR project that would be simple to understand and easy to pick up, yet create competition amongst those who play it. I recalled the time I spent playing the classic game "Joust", in which players must joust opponents (which mostly consisted of landing on them) to score and win.

What it does

FWoosh is a modern, multiplayer VR variant on this idea. You move around an enclosed 3D space by looking where you want to go with a VR headset - in this case, an Android-powered phone inserted into a Google Cardboard headset. Your ship continually moves forward at a slow but constant speed, and you change your direction and orientation by looking around. Google Cardboard provides a single button; pressing it lets you "FWoosh" your ship - an instant boost in speed - before returning to your previous forward velocity. You must then wait for a cooldown period before you can FWoosh again.
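The boost-then-cooldown mechanic above can be sketched as a small timing state machine. This is a hypothetical illustration, not the game's actual code; all names and the specific speed/duration values are invented for the example.

```cpp
#include <cassert>

// Hypothetical sketch of the FWoosh boost/cooldown timing (names and
// numbers are invented for illustration).
struct FwooshState {
    float BaseSpeed         = 2.0f;  // slow, constant forward speed
    float BoostSpeed        = 8.0f;  // instant boost speed
    float BoostDuration     = 0.5f;  // seconds the boost lasts
    float CooldownTime      = 3.0f;  // seconds before the next FWoosh
    float BoostRemaining    = 0.0f;
    float CooldownRemaining = 0.0f;

    // Pressing the Cardboard button attempts a FWoosh.
    bool TryFwoosh() {
        if (CooldownRemaining > 0.0f) return false;  // still cooling down
        BoostRemaining    = BoostDuration;
        CooldownRemaining = CooldownTime;
        return true;
    }

    // Called once per frame; returns the current forward speed.
    float Tick(float DeltaSeconds) {
        if (BoostRemaining > 0.0f)    BoostRemaining    -= DeltaSeconds;
        if (CooldownRemaining > 0.0f) CooldownRemaining -= DeltaSeconds;
        return (BoostRemaining > 0.0f) ? BoostSpeed : BaseSpeed;
    }
};
```

The key property is that a second button press during the cooldown simply fails, while the ship drops back to its base speed on its own once the boost expires.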

The aim is to strike the engines of your opponent's ship with the bow of your own. If two ships collide nose-to-nose, there is a disorientation phase in which both ships are "stunned" - unable to move forward or rotate - leaving you vulnerable to other encircling vessels.

Dotted around the enclosed play area are obstacles, which will slow you down or stun you like the bow of an enemy ship, but also power-ups, such as a protective shield or a faster "FWoosh" recovery.

How I built it

FWoosh is powered by Unreal Engine 4, programmed predominantly in C++ and making use of (and contributing back to) some community plugins.

Challenges I ran into

No current Google Cardboard integration in the Unreal Engine!

Accomplishments that I'm proud of

Will expand later in the Jam!

What I learned

Will expand later in the Jam!

What's next for FWoosh

Orientation-driven UI - look around the world and the centre point acts like a mouse cursor; press the Cardboard's button to activate the in-game option.

Android Orientation in UE4 Plugin - I've already achieved something similar in a different engine, so I'll be porting it over.

Update 1 - 15/04/2016

The past week has been spent porting my previous implementation of an AHRS (Attitude and Heading Reference System) algorithm into a UE4 plugin. There is a project called "SimpleHMD" which provides an external plugin for dual rendering with lens distortion. Using this as a starting point, I've begun porting over the AHRS code, starting with the Android build: the existing orientation code updates from raw device rotational values, so sensor drift (mostly from the gyroscope) is a significant issue here!

The approach used is an adaptation of Sebastian Madgwick's AHRS algorithm; the original can be found here: with the source here:
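To give a flavour of what the algorithm does, here is a simplified fragment of a Madgwick-style update: it integrates the gyroscope rate into the orientation quaternion and re-normalises. The full algorithm additionally applies a gradient-descent correction from the accelerometer (and magnetometer) to cancel gyro drift; that step is omitted here for brevity, and this sketch is not the plugin's actual code.

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Integrate a gyroscope reading (rad/s) over dt seconds into quaternion q.
Quat IntegrateGyro(Quat q, float gx, float gy, float gz, float dt) {
    // Quaternion rate of change: qDot = 0.5 * q ⊗ (0, gx, gy, gz)
    float qDotW = 0.5f * (-q.x * gx - q.y * gy - q.z * gz);
    float qDotX = 0.5f * ( q.w * gx + q.y * gz - q.z * gy);
    float qDotY = 0.5f * ( q.w * gy - q.x * gz + q.z * gx);
    float qDotZ = 0.5f * ( q.w * gz + q.x * gy - q.y * gx);

    q.w += qDotW * dt;  q.x += qDotX * dt;
    q.y += qDotY * dt;  q.z += qDotZ * dt;

    // Re-normalise so the quaternion remains a valid rotation.
    float n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
    q.w /= n; q.x /= n; q.y /= n; q.z /= n;
    return q;
}
```

Integrating the gyroscope alone like this is exactly why drift accumulates: any rate bias is summed into the orientation every step, which is what the accelerometer correction in the full algorithm compensates for.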

The core algorithm has been retained, but I have made several adjustments. First, the gathering of the sensor values makes use of an ALooper instance from the Android NDK; with this, I gather all of the sensor values as fast as Android can throw them. Next, the AHRS algorithm itself resides within an FRunnable class in the Unreal Engine, with update methods available to pass in the values from the device sensors - making use of mutexes (in Unreal's case, FCriticalSection) to ensure everything is thread-safe when reading and writing. Also, to prevent constant variable creation and memory allocation, I have moved many of the variables to private class members and simply reuse them. The Unreal implementation is nearly ready; I expect to have it finalised after the weekend!
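The threading pattern described above can be sketched in plain standard C++: in the engine the worker is an FRunnable and the lock an FCriticalSection, so here std::mutex stands in for the latter. The latest sample is written from the Android sensor (ALooper) thread and read from the AHRS thread, and the members are reused so no per-sample allocation occurs. The class and method names are illustrative, not the plugin's actual API.

```cpp
#include <mutex>

// Plain-C++ analogue of the plugin's threading pattern (names invented).
class SensorBridge {
public:
    // Called from the Android sensor (ALooper) thread.
    void PushGyro(float x, float y, float z) {
        std::lock_guard<std::mutex> Lock(Mutex);  // FCriticalSection in UE4
        Gx = x; Gy = y; Gz = z;
        bFresh = true;
    }

    // Called from the AHRS update thread; returns false if no new sample.
    bool PopGyro(float& x, float& y, float& z) {
        std::lock_guard<std::mutex> Lock(Mutex);
        if (!bFresh) return false;
        x = Gx; y = Gy; z = Gz;
        bFresh = false;
        return true;
    }

private:
    std::mutex Mutex;
    float Gx = 0.f, Gy = 0.f, Gz = 0.f;  // reused members, no reallocation
    bool bFresh = false;
};
```

Holding the lock only while copying three floats keeps contention minimal, which matters when the sensor thread is pushing values as fast as Android delivers them.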

As I'm primarily a programmer, FWoosh is going to contain a lot of "Programmer Art". Simple primitives with a wireframe shader and basic particle effects are how the game is going to look - maybe I can improve on this... but I doubt it for the submission :-)
