Welcome to the Mixed Reality Toolkit
Many toolkits, frameworks and assets have been created to date to help developers build Mixed Reality solutions, and these are then multiplied for each and every SDK / device that vendors supply.
Some solutions have made a start at unifying these capabilities, namely:
BridgeXR: overlays the various installable SDKs with some common orchestration components, with a VR focus.
Virtual Reality Toolkit (VRTK): another VR-focused framework which has been very popular with the community and has lots of support. It is currently being rewritten to expand its reach.
HoloToolkit: a true mixed reality framework utilizing spatial recognition and understanding, together with a wealth of Mixed Reality UX components to speed up development; however, it is wholly focused on the Windows Mixed Reality platforms such as HoloLens and Windows 10.
Within this landscape, the need for a true Mixed Reality framework was identified: one able to cope with all the various needs across the whole spectrum of Mixed Reality, from VR (Virtual Reality) and XR (Cross Reality) to AR (Augmented Reality). The framework would also need to be extensible enough to cope with the multitude of devices, both now and in the future. Thus the new Mixed Reality Toolkit (MRTK) was born.
Architecting for the future
Pulling in good architecture practices from multiple sources and finding the patterns that worked best with the Unity game engine, the Mixed Reality Toolkit team crafted a high-level architecture that would satisfy the needs of current devices and capabilities, whilst still leaving room for the future:
The core architecture separates each area of required functionality into separate parts, ensuring there are no hard dependencies between them. This enables any single component of the framework to be easily replaced or extended.
New devices can be added according to the definitions set out by the framework, and input is orchestrated in a platform-independent way, so that no matter what device is plugged into the framework, a developer's project remains unaffected by the change and simply receives input in a standardised MRTK format.
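To sketch the idea in Unity's C# (the interface and type names below are illustrative assumptions for this article, not the toolkit's actual API), a device-agnostic input layer boils down to vendor-specific device managers that all raise the same standardised event data:

```csharp
// Illustrative sketch only: these names are hypothetical,
// not the Mixed Reality Toolkit's real classes.

// Each device vendor implements this against their own native SDK
// (HoloLens hand tracking, OpenVR controllers, etc.).
public interface IInputDeviceManager
{
    void Enable();   // start listening to the native SDK
    void Disable();  // tear down native subscriptions
}

// Projects only ever see this standardised payload,
// regardless of which device produced it.
public struct InputEventData
{
    public uint SourceId;              // which controller / hand raised it
    public string Action;              // e.g. "Select", "Grip"
    public UnityEngine.Vector3 Position;
}

// Project code implements handlers against the standard events,
// so swapping the underlying device never touches project code.
public interface IInputHandler
{
    void OnInputDown(InputEventData eventData);
    void OnInputUp(InputEventData eventData);
}
```

Because a HoloLens hand-tracking manager and an OpenVR controller manager would both raise the same event data, the project's handlers never need to know (or care) which device is attached.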
Everything is optional, aka “what’s in the box”
One core piece of feedback the team had during the formation of the project was that everything needed to be optional: not all projects will need all the features of a full Mixed Reality Toolkit. Some projects only provide a VR experience, some might be AR only, and others might only need networking / sharing as their backbone, so the toolkit had to have the lightest footprint possible, only enabling what you actually need.
What really challenged the team was the thought that any code NOT needed should NOT even be in the project. If you have ever tried to build a framework where users can delete large portions of the code and it still runs, you will know this is a real challenge.
So as part of creating this “pick and choose” framework, the toolkit would provide “out of the box” the following systems to aid mixed reality projects:
- A UX toolbox - providing the usual VR / AR visual elements for building projects
- Mixed Reality specialized shaders
- A multi-platform network sharing system - for sharing Mixed Reality experiences
- Abstracted Spatial Scanning system - for getting spatial data from different devices
- Spatial Understanding frameworks - to “understand” what the player is seeing
- Spatial audio - crafted for those mixed reality experiences, letting the player hear in 3D space
- Abstracted Input system - providing a single way for projects to consume input from a multitude of devices
- Space recognition systems - to stop the player walking off a very real cliff (or over the cat)
- A uniform configuration system - to give you a one-stop shop for configuring everything in the toolkit
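In Unity terms, a uniform configuration system like the one listed above is naturally built from ScriptableObject "profile" assets that can be swapped per project or per platform. A minimal hedged sketch follows; the class and field names are illustrative assumptions for this article, not the toolkit's actual classes:

```csharp
using UnityEngine;

// Illustrative sketch: a configuration profile stored as a Unity asset.
// Names are hypothetical, not the toolkit's real API.
[CreateAssetMenu(menuName = "Mixed Reality/Configuration Profile")]
public class MixedRealityConfigurationProfile : ScriptableObject
{
    [Header("Optional systems: untick what your project doesn't need")]
    public bool EnableInputSystem = true;
    public bool EnableSpatialAwareness = false;
    public bool EnableSharingSystem = false;

    [Header("System-specific sub-profiles")]
    public ScriptableObject InputSystemProfile;
    public ScriptableObject SpatialAwarenessProfile;
}
```

Because each system reads its settings from a profile asset rather than from values scattered across scene objects, the whole toolkit can be reconfigured, or a system disabled entirely, by editing a single asset: exactly the "everything is optional" goal described above.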
Wait, there’s more!
This was just a taste of what has gone on with the Mixed Reality Toolkit to date, summarizing months of discussions, code, rewrites, fights and the odd barn dance or two.
The team has been focused on forming a solid foundation for the project, starting with Windows Mixed Reality and OpenVR (the two main platforms for launch) and ensuring all systems work well no matter which platform you deploy to. We’re not quite there yet: a lot of systems are in place and yet more are in the midst of being built, but it’s all functional.
If you want to quickly dive in and start playing, then check out the main Mixed Reality Development branch for more info.
P.S. Be sure to check out the next article, which will give you a 25-minute “getting started” guide to see what all the fuss is about.