XR Jockey

The spatial instruments in this demonstration are constructed in Ableton Live and Unity in tandem, forming a multidimensional controller. The bass drum fires a projectile each time it is hit, and an electric pulse is summoned each time the keyboard is struck. This example illustrates how the artist and audience can engage in more complex behaviors with each other. Several shaders were gathered from the Unity Asset Store, and sampled music was used to create the performance.
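As a rough sketch, the note-to-visual mapping described above might look like the following in Unity C#. The class, prefab references, method names, and tuning values here are illustrative assumptions, not the actual implementation; the receiving layer would call these methods when the corresponding note events arrive.

```csharp
using UnityEngine;

// Hypothetical sketch: mapping incoming note events to the visuals
// described above. All names and values are assumptions for illustration.
public class NoteVisualizer : MonoBehaviour
{
    [SerializeField] GameObject projectilePrefab; // fired on each bass drum hit
    [SerializeField] GameObject pulsePrefab;      // spawned on each keyboard note
    [SerializeField] Transform muzzle;            // origin point for projectiles
    [SerializeField] float projectileSpeed = 20f;

    // Invoked whenever a bass drum note event arrives.
    public void OnKick()
    {
        GameObject shot = Instantiate(projectilePrefab, muzzle.position, muzzle.rotation);
        if (shot.TryGetComponent(out Rigidbody body))
            body.velocity = muzzle.forward * projectileSpeed;
        Destroy(shot, 5f); // clean up after a few seconds
    }

    // Invoked for each keyboard note; velocity in [0, 1] scales the pulse.
    public void OnKeyboardNote(float velocity)
    {
        GameObject pulse = Instantiate(pulsePrefab, muzzle.position, Quaternion.identity);
        pulse.transform.localScale = Vector3.one * Mathf.Lerp(0.5f, 2f, velocity);
        Destroy(pulse, 2f);
    }
}
```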
The audio track was generated using Ableton Live. Each MIDI note is sent to TouchDesigner through its native Ableton support. A generic OSC controller app on an Android device provides GUI controls such as xy-coordinate grids, toggles, buttons, and sliders, which also send OSC channel messages to TouchDesigner over the network. The states of these control interfaces are then mapped to C# methods in Unity through the OSC simpl plugin. TouchDesigner is widely used by artists to create performances, installations, and fixed-media works, but the extent of its capabilities with AR remains largely unexplored. In this setup, the xy-coordinate grid specifies an xy-coordinate at a location shared by the coexisting AR and VR scenes, and an ImageTarget is substituted for a ground plane.
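To make the grid mapping concrete, the sketch below shows one way a normalized (x, y) pair from the controller app could be projected onto a plane anchored at the ImageTarget, so the same point exists in both the AR and VR scenes. The class, field names, and play-area extents are assumptions for illustration; in this setup the handler would be bound to the incoming OSC message by OSC simpl.

```csharp
using UnityEngine;

// Hypothetical sketch of the xy-grid mapping: a normalized (x, y) pair
// arriving over OSC is remapped into the local plane of the anchor
// (the ImageTarget standing in for a ground plane).
public class GridPlacement : MonoBehaviour
{
    [SerializeField] Transform anchor;   // ImageTarget anchor shared by AR and VR scenes
    [SerializeField] Vector2 extents = new Vector2(2f, 2f); // half-size of play area, in meters (assumed)
    [SerializeField] Transform marker;   // object positioned by the performer

    // Invoked by the OSC layer with grid values in [0, 1].
    public void OnGrid(float x, float y)
    {
        // Remap [0, 1] to [-extent, +extent] on the anchor's local XZ plane.
        Vector3 local = new Vector3(
            (x - 0.5f) * 2f * extents.x,
            0f,
            (y - 0.5f) * 2f * extents.y);
        marker.position = anchor.TransformPoint(local);
    }
}
```

Anchoring the mapping to the ImageTarget's transform, rather than to world coordinates, is what lets the AR and VR views agree on where the placed object sits.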
This iteration comprises prototypes of audio/visual instruments that can be programmed beforehand or manipulated live during a performance. Much like a DJ, the performer is free to borrow and transform others' works of art. Therefore, by introducing everything that game engines have to offer into the repertoire of live musical performance, we can also tap into the vast body of user-created content to borrow, transform, and present, opening new opportunities for experimentation.