I looked for a project that would maximize the benefits of StereoKit. A mind map seemed an excellent fit thanks to StereoKit's ability to load assets at runtime, but I wanted to extend the scope and usefulness of mind maps so they could also be used for training and entertainment. This would require complex configuration and would test StereoKit's dynamic approach to user interfaces.

What it does

Create linked node structures all around you to represent your idea. Use the inbuilt 3D shapes or images for each node and customise the colours, or load your own. Nodes are added to locations, which are connected by portals. This lets you arrange and optimise your idea, and is an ideal way of using the space around you in VR / AR. Locations can also have their own background music.
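As a rough sketch of the structure described above (the type and property names are illustrative assumptions, not the project's actual classes), nodes live in locations, and locations link to each other through portals:

```csharp
using System.Collections.Generic;

// Illustrative sketch only: these names are assumptions, not Idea Engine's real code.
public class Node
{
    public string Label = "";
    public string Shape = "cube";     // inbuilt 3D shape, or an image
    public string Colour = "#FFFFFF"; // customisable per node
}

public class Location
{
    public string Name = "";
    public string? BackgroundMusic;          // optional music file for this location
    public List<Node> Nodes = new();         // nodes placed in this location
    public List<Location> Portals = new();   // locations reachable through portals
}

public static class Demo
{
    // Builds a tiny two-location map and returns the starting location.
    public static Location BuildSampleMap()
    {
        var workshop = new Location { Name = "Workshop" };
        var library  = new Location { Name = "Library", BackgroundMusic = "calm.mp3" };

        workshop.Portals.Add(library);   // portals connect locations
        library.Portals.Add(workshop);   // in both directions

        workshop.Nodes.Add(new Node { Label = "Main idea", Shape = "sphere" });
        return workshop;
    }
}
```

With this shape, walking the map is just following `Portals`: `Demo.BuildSampleMap().Portals[0].Name` gives `"Library"`.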

Complex interactions can be included by adding buttons to nodes. You can hide and reveal nodes, change their colour, description and shape, and change a player's health and score as they play through interactive scenarios. While in play mode, you can pick up configured nodes, carry them in your inventory and use them to interact with other nodes.
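A button-driven interaction like the ones described above might be modelled along these lines (a minimal sketch with assumed names, not the actual implementation):

```csharp
using System.Collections.Generic;

// Illustrative sketch; names and fields are assumptions.
public class Player
{
    public int Health = 100;
    public int Score  = 0;
    public List<string> Inventory = new();   // nodes the player is carrying
}

public class NodeButton
{
    public int HealthDelta;        // change applied to the player's health
    public int ScoreDelta;         // change applied to the player's score
    public string? RevealNode;     // a hidden node to reveal when pressed

    // Pressing the button applies its configured effects.
    public void Press(Player player, HashSet<string> hiddenNodes)
    {
        player.Health += HealthDelta;
        player.Score  += ScoreDelta;
        if (RevealNode != null)
            hiddenNodes.Remove(RevealNode);
    }
}
```

For example, a "trap" button could be configured with `HealthDelta = -10` and `RevealNode = "Exit"`, so stepping into the trap costs health but reveals the way out.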

How we built it

I made extensive use of StereoKit's online documentation, demos and GitHub source code to understand how some of the features work. I love the hand menu and extended it to make accidental activation less likely while guiding the user in its use. You now have to align two circles in 3D before you grip and trigger the menu.
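The alignment gate could be sketched like this (a standalone illustration with an assumed threshold; the real app would use StereoKit's own `Vec3` and hand-tracking types rather than this local struct):

```csharp
using System;

// Illustrative sketch of the two-circle activation gate; the threshold
// and names are assumptions, not the project's actual values.
public readonly struct Vec3
{
    public readonly float X, Y, Z;
    public Vec3(float x, float y, float z) { X = x; Y = y; Z = z; }

    public static float Distance(Vec3 a, Vec3 b)
    {
        float dx = a.X - b.X, dy = a.Y - b.Y, dz = a.Z - b.Z;
        return MathF.Sqrt(dx * dx + dy * dy + dz * dz);
    }
}

public static class HandMenuGate
{
    // Circles count as aligned when their centres are within ~2 cm.
    const float AlignThreshold = 0.02f;

    // The menu only opens if the hand's circle overlaps the anchor circle
    // while the user grips, so a grip elsewhere can't trigger it by accident.
    public static bool CanActivate(Vec3 handCircle, Vec3 anchorCircle, bool isGripping)
        => isGripping && Vec3.Distance(handCircle, anchorCircle) < AlignThreshold;
}
```

A grip 1 cm from the anchor passes the gate; the same grip 10 cm away does nothing, which is what makes accidental activation unlikely.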

Challenges we ran into

The keyboard input was a little restrictive due to the lack of cursor-key support. I didn't start the project until the 29th of November, so time was my biggest issue, but StereoKit helped me produce a very feature-rich application in that time. I also struggled to fit all the functionality into the video.

Accomplishments that we're proud of

  • The concept of extending mind maps for training and entertainment.
  • Creating a feature rich project in a week.
  • Making use of so many of StereoKit's features.
  • The extensions to the hand menu.

What we learned

I feel that I have a good grasp of the capabilities of StereoKit, together with its design philosophies of hands-first and non-persistent user interfaces.


The game was created and tested using the Oculus Rift, and I used a physical keyboard for input.

What's next for Idea Engine

  • Add customisable lighting for each location using spherical harmonics.
  • Investigate using speech-to-text.
  • Allow user to record sound effects using their mic.
  • Let user configure sound effects as part of the interaction events.
  • Add multiplayer support.
  • Increase features for interactive training / adventure games.
  • Release on PC VR and mobile VR, taking advantage of OpenXR.

Outstanding issues

The application sometimes crashes intermittently in the SK main loop, with nothing in the stack trace. It only happens occasionally, and restarting resolves the issue.

Built With

  • c#
  • stereokit


Update

The binary and source code have been updated with some new improvements.

1) Upgraded to preview 5 of StereoKit to fix occasional crashes.
2) Nodes can now be edited with the left or right hand.
3) Editing options are hidden away while you are actively editing another property. This stops accidental editing-icon selections and makes it clearer which input field has the keyboard focus.
4) Changed locomotion on the left hand to make it more consistent with the right hand and to allow the hand to be used for interacting without accidentally performing locomotion. You now align two circles before gripping to initiate locomotion.
5) [More Info] windows can now be dragged around.

Next I will be proving that I can pass all the Oculus Quest VRCs required to release on the App Lab store, and researching whether I can release on HoloLens. Then I can invest more time knowing I will be able to release on all platforms.
