The Generations series of works uses the Evol algorithmic music system, which transforms captured performance fragments over time. The fragments are iteratively transformed, creating an evolving motif that gradually drifts away from the original and takes on new characteristics. The effect is a layered texture that shifts in subtle ways, reminiscent of minimalist compositions from the twentieth century. This continues my exploration of performing with reflexive music software, where the musician’s performance fragments are captured, transformed and re-used as part of a generative musical accompaniment. The system manipulates MIDI data and was developed in Pure Data.
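The Evol system itself is implemented in Pure Data, but the capture-and-drift idea can be sketched in Python. The note representation and the particular mutation rules below (small pitch drift, velocity jitter, slight rhythmic stretching) are my own illustrative assumptions, not the actual transformations Evol applies:

```python
import random

# A captured fragment: a list of (pitch, velocity, duration_ms) MIDI notes.
# The mutation rules here are illustrative assumptions, not Evol's actual rules.

def transform(fragment, rng):
    """Apply one generation of small random mutations to a fragment."""
    out = []
    for pitch, vel, dur in fragment:
        pitch += rng.choice([-2, -1, 0, 0, 1, 2])          # small melodic drift
        vel = max(1, min(127, vel + rng.randint(-8, 8)))   # dynamic jitter
        dur = max(50, int(dur * rng.uniform(0.9, 1.1)))    # slight rhythmic stretch
        out.append((max(0, min(127, pitch)), vel, dur))
    return out

def evolve(fragment, generations, seed=0):
    """Iteratively transform a fragment, keeping every generation as a layer."""
    rng = random.Random(seed)
    layers = [fragment]
    for _ in range(generations):
        fragment = transform(fragment, rng)
        layers.append(fragment)
    return layers

# A hypothetical captured motif: C-E-G-C with varied velocities and durations.
motif = [(60, 90, 500), (64, 85, 250), (67, 80, 250), (72, 95, 1000)]
layers = evolve(motif, generations=8)
```

Playing the layers back together (or in succession) yields the layered, gradually drifting texture described above: early generations remain close to the captured motif, while later ones accumulate enough mutations to feel like new material.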
A revised version of the work, Generation II, was performed at the Australasian Computer Music Conference in Adelaide, Australia, on 29 September 2017. The Evol system is MIDI-based, so the timbral character of the work is determined by sound choices in the playback engine. In the performance of Generation I these were tonal analogue synthesizer tones, whilst for Generation II these were mixed with recorded samples and more aggressive glitch- and noise-based timbres.
The first performance with this software was Generation I, an interactive electronic music performance between a live improviser and the Generations algorithmic music system. It was performed at Griffith University’s Music Technology department concert, ‘New Interfaces for Music Expression’, on 6 March 2017.