We envision a future where nearly everyone is equipped for augmented and mixed reality, opening a whole new world of creative possibilities layered on top of our physical world.
What it does
In this prototype, we presume the artist builds a mixed-reality performance in a series of steps: loading a physical stage, music, and assets into the editor, then designing an augmented performance in which a real human dancer interacts with an augmented-reality character. Our demo lets the viewer change camera angles, scrub the timeline, and change the time of day to fine-tune the presentation and see how the piece would appear to audiences at different moments and physical locations.
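The time-of-day control can be sketched as a simple mapping from clock time to sun elevation, which an editor could feed into the scene's lighting. Everything here (the function name, the fixed sunrise/sunset, the sine-arc day model) is a hypothetical simplification for illustration, not Simmetri's actual API:

```python
import math

def sun_elevation_deg(hour: float, sunrise: float = 6.0, sunset: float = 18.0) -> float:
    """Map a 24-hour clock time to an approximate sun elevation in degrees.

    Uses a simple sine arc between sunrise and sunset; negative values
    mean the sun is below the horizon (night).
    """
    day_length = sunset - sunrise
    # Fraction of the daylight arc completed (falls outside [0, 1] at night).
    t = (hour - sunrise) / day_length
    return 90.0 * math.sin(math.pi * t)

print(sun_elevation_deg(12.0))  # noon: sun at its zenith (90.0)
print(sun_elevation_deg(0.0))   # midnight: below the horizon (-90.0)
```

Dragging the time-of-day slider would simply re-evaluate this mapping and relight the scene.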
We present this as a proof of concept for how performers of the future might interact with the digital, physical, and automated worlds.
How we built it
The demo was built using a modified version of Simmetri, a tool designed for creating interactive art. We worked with an interactive artist to identify the core capabilities a collaborative mixed-reality design tool needs in order to move beyond the limitations of current game engines.
We integrated with Verizon ThingSpace (IoT): our performer dynamically controls a nearby street lamp during the mixed-reality performance. We also drew on well-prepared 3D and animation assets from Adobe Mixamo.
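At its core, the light-control integration amounts to posting a small JSON command to a ThingSpace REST endpoint. The sketch below only builds such a request; the base URL, endpoint path, device ID, and field names are placeholder assumptions for illustration, not the documented ThingSpace API:

```python
import json

# Placeholder values -- the real base URL, credentials, and device ID come
# from a ThingSpace account; these are illustrative only.
THINGSPACE_BASE = "https://thingspace.example/api/v2"

def build_lamp_request(device_id: str, on: bool) -> tuple[str, bytes]:
    """Build the URL and JSON body for a hypothetical 'set lamp state' call."""
    url = f"{THINGSPACE_BASE}/devices/{device_id}/actions"
    body = json.dumps({"action": "setState", "state": "on" if on else "off"}).encode()
    return url, body

url, body = build_lamp_request("street-lamp-01", on=True)
# An actual send would POST this with urllib.request plus an auth header, e.g.:
# req = urllib.request.Request(url, data=body, headers={...}, method="POST")
```

In the demo, an event on the performance timeline triggers a call like this, so the physical lamp reacts in sync with the augmented character.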
Challenges we ran into
The concept is ambitious, to say the least. Essentially, we want to expose a suite of game-builder-like tools, which involves a high degree of UI and implementation complexity. Ultimately we reduced our scope and landed on a few proof-of-concept features that give a sense of the tool's core interface.
Accomplishments that we're proud of
Scrubbing time in real time, toggling views seamlessly, and tightening the iterative loop of timing and placement. We're also quite satisfied with how easy it was to interface with Verizon ThingSpace to activate a physical light from the app.
What we learned
Unlike a traditional game engine, where you press play and wander around as _time_ ticks on, we exposed control of time as an integral part of the editing experience, and we began to map out what's required to connect the offline editing domain with the real-time, interactive experience domain. Editable time and space controls at game time are the essence of mixed-reality performance design.
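The scrub-driven editing loop boils down to evaluating every animated property at an arbitrary time, rather than advancing a clock. A minimal sketch of that idea (our own illustration with a piecewise-linear keyframe track, not Simmetri's internals):

```python
from bisect import bisect_right

class Track:
    """Keyframe track: time -> value, evaluable at any scrub position."""

    def __init__(self, keyframes):
        # keyframes: list of (time, value) pairs; stored sorted by time.
        self.keys = sorted(keyframes)

    def value_at(self, t: float) -> float:
        times = [k[0] for k in self.keys]
        i = bisect_right(times, t)
        if i == 0:                     # before the first key: clamp
            return self.keys[0][1]
        if i == len(self.keys):        # after the last key: clamp
            return self.keys[-1][1]
        (t0, v0), (t1, v1) = self.keys[i - 1], self.keys[i]
        a = (t - t0) / (t1 - t0)       # interpolation factor in [0, 1]
        return v0 + a * (v1 - v0)

# Scrubbing to any timeline position evaluates the pose directly --
# no playback required.
dancer_x = Track([(0.0, 0.0), (2.0, 4.0), (4.0, 4.0)])
print(dancer_x.value_at(1.0))  # 2.0
```

Because evaluation is a pure function of time, the same tracks can drive both offline scrubbing in the editor and real-time playback during the live performance.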
Also, with Verizon ThingSpace, connecting VR to IoT was easier than expected, which bodes well for integrating IoT into mixed reality.
What's next for SpaceTime Studio: Mixed Reality Performance Designer
We ultimately envision a full suite of editing and creation tools, building toward a platform that lets the artist in everyone painlessly produce these next-generation experiences and performances.