XRSynth is a project that explores a new and unorthodox way of interacting with synthesizers. By mapping the physical behaviour of one or more virtual objects to the parameters of synthesizers, it is possible to create an environment that encourages a different approach to sound design. This may allow composers and artists to formulate ideas based on generative inputs, and it lets those without any knowledge of sound design engage in sonic experimentation. The project has broad scope, with applications in video game development and augmented or virtual reality, and it holds interest for end users of all experience levels.

This demo shows several objects mapped to different audio outputs. Two of the objects play an audio sample when they collide with another object, showing the potential for varying sounds. The third object has components of its physical state mapped to parameters of a synthesiser plugin: its y coordinate determines the pitch of the synthesiser, and its rotation determines the frequency of an LFO amplitude modulator. The app was built by writing a small virtual world in the Unity engine in C# and a synthesiser DAW plugin in C++ using the JUCE framework. The plugin was integrated into Unity using the new JUCE Unity integration system.
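The mappings described above can be sketched as simple transfer functions on the plugin side. This is a minimal illustration, not the project's actual code; the function names and ranges are assumptions chosen for the example.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical mapping helpers (illustrative names, not from XRSynth itself).

// Map a normalised vertical position (0..1) to a pitch in Hz over a
// two-octave range above a base frequency, so equal movement corresponds
// to equal musical intervals.
float positionToPitchHz(float y, float baseHz = 220.0f, float octaves = 2.0f)
{
    y = std::clamp(y, 0.0f, 1.0f);
    return baseHz * std::pow(2.0f, y * octaves);
}

// Map a rotation angle in degrees (0..360) linearly onto an LFO rate
// in Hz, here 0.1 Hz up to 20 Hz.
float rotationToLfoHz(float degrees, float minHz = 0.1f, float maxHz = 20.0f)
{
    float t = std::fmod(std::fabs(degrees), 360.0f) / 360.0f;
    return minHz + t * (maxHz - minHz);
}
```

In a setup like this, the Unity side would read the object's transform each frame and forward the mapped values to the plugin's pitch and LFO-rate parameters.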

Inspiration

Trying to find new ways of interaction using 3D technology, virtual reality, and augmented reality.

What it does

How we built it

Challenges we ran into

Controlling audio processing parameters with physical parameters.
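One reason this is challenging is the rate mismatch: physics updates arrive at the visual frame rate while audio parameters change per sample, so raw jumps cause audible zipper noise. A common remedy is to smooth each parameter toward its target in the audio thread. The sketch below is an assumption about how this could be handled (JUCE also ships a ready-made `juce::SmoothedValue`); the class name is illustrative.

```cpp
#include <cmath>

// Minimal one-pole parameter smoother (illustrative, not XRSynth's code).
// Targets are set at the physics frame rate; next() is called once per
// audio sample and eases the value exponentially toward the target.
class ParamSmoother
{
public:
    ParamSmoother(float sampleRate, float smoothingTimeSec)
    {
        // Coefficient for an exponential approach with the given time constant.
        coeff = std::exp(-1.0f / (smoothingTimeSec * sampleRate));
    }

    void setTarget(float t) { target = t; }

    // Call once per audio sample; returns the smoothed value.
    float next()
    {
        current = target + (current - target) * coeff;
        return current;
    }

private:
    float coeff = 0.0f;
    float target = 0.0f;
    float current = 0.0f;
};
```

Here `setTarget` would be driven from the game-engine update, while `next` runs inside the plugin's audio callback.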

Accomplishments that we're proud of

We have created a working user-responsive prototype of our instrument.

What we learned

We discovered the new possibilities that come with the JUCE and Unity integration.

What's next for XRSynth - VR and AR Synthesizer

Tweaking the user interface. Implementing more advanced audio synthesis algorithms. Adding spatial audio. Testing and optimising for VR and AR headsets.
