Performance: ACMC 2017.
Artist: Scott Simon
Electronic soundscape and improvised guitar
Score to the work "On the philosophical idea of the beautiful" by Scott L. Simon.
Performance ACMC 2017 @ Electronic Music Unit, Elder Conservatorium, Adelaide.
Max 4 Live application. Find here an M4L application under construction. Note the series of message boxes that have cue points. These cue points connect to sample-accurate positions within Gen patchers that are triggered by the DAW timeline. The sounds are very short grains that are triggered in relation to the input score. These triggers are in the process of being randomized and made responsive to the user's playing. A notated score (a harmonization of the central theme) will be the "material" upon which this process is worked.
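As a rough illustration of the triggering idea (not the patch itself), the sketch below maps a cue point given in beats to a sample-accurate offset and then applies a small random jitter to each grain trigger. The tempo, sample rate and cue values are assumptions for the example only.

```cpp
// Minimal sketch: cue points (in beats) -> sample-accurate positions, plus a
// small random jitter on each grain trigger. Illustration only, not the M4L/Gen~ patch.
#include <cstdint>
#include <iostream>
#include <random>
#include <vector>

// Convert a DAW timeline position (in beats) to an absolute sample offset.
int64_t beatsToSamples(double beats, double bpm, double sampleRate) {
    return static_cast<int64_t>(beats * (60.0 / bpm) * sampleRate);
}

int main() {
    const double bpm = 120.0;   // assumed tempo
    const double sr  = 44100.0; // assumed sample rate
    std::vector<double> cueBeats = {0.0, 4.0, 8.0, 12.0}; // cue points from the "message boxes"

    std::mt19937 rng{std::random_device{}()};
    std::uniform_int_distribution<int64_t> jitter(-2205, 2205); // about +/- 50 ms of randomization

    for (double beat : cueBeats) {
        int64_t pos = beatsToSamples(beat, bpm, sr) + jitter(rng);
        std::cout << "trigger grain at sample " << pos << "\n";
    }
}
```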
This section documents the development of the performance in its different aspects: a list of the different components as they are tested and produced.
Notes on the performance:
1. Observation: the structure of the piece will be organized around the original "Grain_flute" patch shown above.
2. The patch will evolve through the levels (see score for details).
3. Another multi-oscillator patch will play chords based on an array produced during improvisations to the "Grain_flute" patch. This pattern of "paired patches" will be repeated through some of the levels.
4. The use of a C++ granulation app is called for. The app is written, but the manner in which it will be integrated with the Ableton DAW is undecided. The guitar sits well with the texture of the grain patterns.
An essential aspect of the piece is an emphasis upon the process - an emulation of the forces of creation - in opposition to the emulation of the surface (that which "appears" as beautiful). This is represented metaphorically in the work's developing levels. The concept of "process" also maintains its hold across and outside of one definitive rendering - the work continues to evolve according to its internal logic.
August 2017
The C++ granulation process is complete; however, as noted above, it is not yet clear how to integrate it. The app as it stands can be controlled via MIDI to change the grain frequencies and distributions. The sound quality is aesthetically pleasing, and various textures can be built up with different settings. With the right combinations of 800-1000 grains a "natural" texture can be constructed; insect noise and forest ambience can be built this way.
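The sketch below is a minimal, self-contained stand-in for that kind of texture, not the app itself: it scatters several hundred short windowed sine grains over a one-second block, with the grain count and frequency range standing in for the MIDI-controlled settings mentioned above (all values are assumptions).

```cpp
// Minimal granular-texture sketch: hundreds of short Hann-windowed sine grains
// summed into one second of audio. Parameter values are placeholders.
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const double pi = 3.141592653589793;
    const double sr = 44100.0;
    const int blockLen  = static_cast<int>(sr);        // one second of output
    const int grainLen  = static_cast<int>(0.02 * sr); // 20 ms grains
    const int numGrains = 900;                         // in the "800-1000 grains" region

    std::vector<double> out(blockLen, 0.0);
    std::mt19937 rng{42};
    std::uniform_int_distribution<int> startDist(0, blockLen - grainLen - 1);
    std::uniform_real_distribution<double> freqDist(2000.0, 8000.0); // assumed insect-like range

    for (int g = 0; g < numGrains; ++g) {
        int start = startDist(rng);
        double freq = freqDist(rng);
        for (int i = 0; i < grainLen; ++i) {
            // Hann window keeps each grain click-free.
            double w = 0.5 * (1.0 - std::cos(2.0 * pi * i / (grainLen - 1)));
            out[start + i] += 0.01 * w * std::sin(2.0 * pi * freq * i / sr);
        }
    }
    std::cout << "rendered " << numGrains << " grains into " << out.size() << " samples\n";
}
```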
The Ableton Live component is configured as four groups representing the levels of the piece. The iteration of the Max patches will continue until the performance. The second level makes use of developments based on the Wenzhou Biennial piece, which was based on randomly accessing audio files with a Gen~ patch. These blocks of audio are then played back (using a poly~ structure to create overlap).
The metronomic kick drum and the triggered blocks of audio create randomized textures and melodic areas. The audio file used for the process is a recording of a version of the final level (not used in the PhD but made afterwards). Using the notated levels in the space this creates is a possible way of transforming the level further.
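For readers unfamiliar with the poly~ idea, the following sketch shows the underlying scheme in plain C++ (an illustration, not the Gen~ patch): a handful of voices each read from a random start point in a source recording, and their staggered trigger times make the fixed-length blocks overlap. The buffer length, block length and voice count are placeholders.

```cpp
// Sketch of randomly accessed, overlapping block playback. Each voice reads a
// random region of a source buffer; staggered triggers give 50% overlap.
#include <iostream>
#include <random>
#include <vector>

struct Voice {
    int triggerTime; // when the voice starts, in output samples
    int readStart;   // where it reads from in the source buffer
};

int main() {
    const int sourceLen = 10 * 44100; // assumed 10-second source recording
    const int blockLen  = 2 * 44100;  // 2-second blocks of audio
    const int numVoices = 4;          // overlapping voices, as poly~ would provide

    std::mt19937 rng{std::random_device{}()};
    std::uniform_int_distribution<int> startDist(0, sourceLen - blockLen);

    std::vector<Voice> voices;
    for (int v = 0; v < numVoices; ++v) {
        // Stagger triggers by half a block so consecutive voices overlap.
        voices.push_back({v * blockLen / 2, startDist(rng)});
        std::cout << "voice " << v << " triggers at sample " << voices[v].triggerTime
                  << ", reading source " << voices[v].readStart << ".."
                  << voices[v].readStart + blockLen << "\n";
    }
}
```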
September 2017
The process of developing the levels in Ableton Live is mostly complete. Some observations about using the Avid Eleven Rack (11R) are offered here. The 11R is a guitar simulator and effects unit that can also double as an interface. There are two ways that it can be used in a live setup. It can be used as the audio interface with the guitar being "added" in over the top (the 11R adds the guitar signal to whatever audio is being passed through the interface). This is fine, but it does not allow the effects of Ableton and M4L to be utilized. One can, however, use those effects by "doubling" the guitar and running it through a channel in Ableton. This is a concern for some, since it introduces latency into the second signal. In practice, though, it acts as a send that one can treat with Gen~: the latency is negligible with a setting of 512 samples in the audio buffer of Ableton's preferences. This method creates two signals, the "natural" signal (which has effects from the 11R itself) and the Gen~ signal; these are summed together for the output. This method sounds interesting.
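For a rough sense of scale (assuming a 44.1 kHz sample rate, which is not specified above), the buffer alone adds about

512 samples / 44,100 samples per second ≈ 11.6 ms

of delay to the doubled signal, before any interface or plugin latency is counted.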
If, however, one only wants the Gen~ signal and a more single-track sound, another method is called for. The 11R is used as the guitar processor but not as the audio interface. The guitar is passed into Ableton and treated in Gen~, and then the master out of the Live set goes out of another USB3 (or USB) port and into a second interface. This requires more setting up, as well as two interfaces.
The construction of the different "levels" of the work has been organized around a structure of dates. From the 2nd to the 12th of September new patches were produced that refined the basic "sketched-in" versions of the M4L patches. These refinements were named after the day they were made, both to push the pace of the process along and to keep track of where everything was located. This has culminated in a complete set at this stage; some refinements are still required, but full practice sessions have begun for the performance.
The focus of the work is now mostly on the augmented guitar levels and algorithmic drumming. The guitar augmentation is quite subtle in the first levels and becomes more pronounced in the last.
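As a generic illustration of what "algorithmic drumming" can mean in this kind of setup (this is not taken from the actual patches), the sketch below triggers a kick pattern by weighting each sixteenth-note step with a probability, so every bar comes out slightly different; all of the values are invented.

```cpp
// Generic probability-weighted step sequencer for a single kick-drum part.
// Illustration only; probabilities are invented for the example.
#include <array>
#include <iostream>
#include <random>

int main() {
    // Trigger probability per 16th-note step.
    std::array<double, 16> kickProb = {1.0, 0.0, 0.1, 0.0, 0.6, 0.0, 0.1, 0.2,
                                       1.0, 0.0, 0.1, 0.3, 0.6, 0.0, 0.2, 0.1};
    std::mt19937 rng{std::random_device{}()};
    std::uniform_real_distribution<double> coin(0.0, 1.0);

    for (int step = 0; step < 16; ++step) {
        bool hit = coin(rng) < kickProb[step];
        std::cout << (hit ? "X" : ".");
    }
    std::cout << "\n"; // e.g. "X...X..XX..X.X.." - a different pattern each bar
}
```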
A brief excerpt of augmented guitar and algorithmic drumming from the piece.