Audio-Visual Interiors is a quadraphonic audio-visual composition performed by the live electronic duo of John R. Ferguson and Andrew R. Brown. Both composers have developed bespoke instruments for this work. Andrew R. Brown explores ring modulation synthesis controlled by multi-dimensional touch gestures; this provides a rich diversity of sonic potential whilst maintaining remnants of physical gesture. Various audio effects and algorithmic sound diffusion enhance and relocate what is otherwise a simple monophonic sound source. This instrument is written in Pure Data and runs on an iPad. John Ferguson’s instrument employs Euclidean rhythms and sound-file granulation in the foreground. He has created a bespoke hardware interface that connects to a software environment written in Pure Data, which runs on an iPhone. Data from both performers’ interactions with their instruments are passed to a TouchDesigner system, which is used to generate and/or manipulate various live visual materials.
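For readers unfamiliar with the two techniques named above, a minimal sketch follows. This is an illustration only, not the duo’s actual Pure Data patches: ring modulation is the sample-wise product of two signals, and a Euclidean rhythm distributes a number of onsets as evenly as possible across a number of steps. The modulo-based generator below is one common formulation and may yield a rotation of the textbook pattern.

```python
import math

def ring_modulate(carrier, modulator):
    """Ring modulation: multiply two signals sample by sample.
    The result contains the sum and difference frequencies of the inputs."""
    return [c * m for c, m in zip(carrier, modulator)]

def euclidean(pulses, steps):
    """Euclidean rhythm: spread `pulses` onsets as evenly as possible
    over `steps` positions (1 = onset, 0 = rest)."""
    return [1 if (i * pulses) % steps < pulses else 0 for i in range(steps)]

# Example: two sine tones at 440 Hz and 30 Hz, ring-modulated.
sr = 44100
n = 1024
carrier = [math.sin(2 * math.pi * 440 * t / sr) for t in range(n)]
modulator = [math.sin(2 * math.pi * 30 * t / sr) for t in range(n)]
out = ring_modulate(carrier, modulator)

# Example: the familiar "tresillo" pattern, E(3, 8).
print(euclidean(3, 8))  # → [1, 0, 0, 1, 0, 0, 1, 0]
```

In a live setting such patterns would typically drive note or grain triggers on each onset, with the pulse and step counts exposed as performable parameters.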
This project contributes to the discourse around what is possible in the realm of live audio-visual performance. The work involves performance with live algorithms, where performer agency is augmented by machine agency. The use of mobile devices, as well as free/open-source software, is an important undercurrent. Overall, the human performers remain central, but the audience engages with an audio-visual spectacle that is sonically and visually immersive, and there is a clear technologically mediated agency that reconfigures what it is to be a live musician.
Ferguson and Brown performed ‘Audio-Visual Interiors’ live at the 2019 Australasian Computer Music Conference (ACMC), Monash University, Melbourne.