Play Console turns the MX Creative Console into a physical orchestration tool for live AI improv scenes. Instead of typing prompts or pausing to configure settings, users shape an evolving improv scene in real time using dials and buttons that map to core improvisational dynamics such as energy, pacing, agreement, emotional intensity, and scene progression.

In the current experience, a human plays one-on-one with a live AI improviser. The user acts as both participant and conductor, using continuous controls to heighten or ground the scene, increase tension or collaboration, and speed up or slow down the flow without interrupting the performance. Discrete buttons trigger scene events such as initiating a status shift, marking a beat, or transitioning to a new phase of the scene.
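The split between continuous dials and discrete buttons can be sketched as a small state model. The parameter names (energy, pacing, agreement, intensity, progression) come from the description above; the class and method names are illustrative placeholders, not the real Actions SDK API.

```python
from dataclasses import dataclass, field

# Continuous dynamics named in the project description.
CONTINUOUS = ("energy", "pacing", "agreement", "intensity", "progression")

@dataclass
class SceneState:
    # Every continuous parameter starts at a neutral midpoint.
    params: dict = field(default_factory=lambda: {p: 0.5 for p in CONTINUOUS})
    events: list = field(default_factory=list)

    def on_dial(self, param: str, delta: float) -> None:
        """A dial turn nudges one continuous parameter, clamped to [0, 1]."""
        self.params[param] = min(1.0, max(0.0, self.params[param] + delta))

    def on_button(self, event: str) -> None:
        """A button press records a discrete scene event (beat, status shift...)."""
        self.events.append(event)

scene = SceneState()
scene.on_dial("energy", +0.25)    # heighten the scene
scene.on_dial("pacing", -0.25)    # slow the flow
scene.on_button("status_shift")   # trigger a discrete event
```

Because dials emit relative deltas rather than absolute positions, the performer can keep steering mid-scene without the control ever "jumping" to a stale value.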

Play Console uses the Actions SDK to support multiple modes, such as rehearsal, performance, and teaching, where the same physical controls take on different meanings depending on context. At any moment, the user can step into the scene as a performer, while still subtly steering the overall dynamics with the console. Rather than scripting dialogue or generating jokes, Play Console gives humans a hands-on, non-visual way to guide emergent interaction, making AI improv scenes more expressive, playable, and collaborative. The same orchestration model is designed to extend naturally to multi-agent scenes in the future.
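The "same controls, different meanings per mode" idea amounts to a routing table keyed by mode. A minimal sketch follows, assuming hypothetical control IDs and a hand-rolled mapping table; the mode names come from the description, but this is not the real Actions SDK configuration format.

```python
# Hypothetical per-mode bindings: one physical dial steers a different
# scene parameter depending on the active mode.
MODE_BINDINGS = {
    "rehearsal":   {"dial_1": "pacing",      "dial_2": "agreement"},
    "performance": {"dial_1": "energy",      "dial_2": "intensity"},
    "teaching":    {"dial_1": "progression", "dial_2": "pacing"},
}

class ModeRouter:
    def __init__(self, mode: str = "performance"):
        self.mode = mode

    def resolve(self, control_id: str) -> str:
        """Translate a physical control into the parameter it steers right now."""
        return MODE_BINDINGS[self.mode][control_id]

router = ModeRouter("rehearsal")
router.resolve("dial_1")   # "pacing" while rehearsing
router.mode = "performance"
router.resolve("dial_1")   # the same dial now steers "energy"
```

Keeping the bindings in data rather than code is what would let future multi-agent modes reuse the same physical controls by adding rows, not rewriting handlers.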
