Concert Piece for Performer
Flame & Dust – a noisy struggle between a flamenco player and the machines (ft. Julius Naidu)
With this piece I wanted to merge noisy and glitchy sounds with a strongly rhythmic tonal lead. I also wanted to utilise audio analysis and machine learning to create a piece that could directly respond to what the performer plays, but in a complex enough way that it may seem “intelligent” and “autonomous”.
The result is a flamenco-inspired piece centred on responsive, granulation-based software. I built this in MaxMSP to extract features from a guitar pickup and a contact mic, as well as to create a modulating granular synthesis. This part of the software is then complemented by Wekinator, which receives the extracted features, extrapolates patterns over time, and finally modulates most of the accompanying synthesis and effects.
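The hand-off between the feature extractor and Wekinator happens over OSC. The Max patch itself isn't reproduced here, but the message format can be sketched in Python (the feature values below are placeholders; Wekinator's documented defaults are UDP port 6448 and the /wek/inputs address):

```python
import socket
import struct

def osc_message(address, values):
    """Encode a minimal OSC message carrying 32-bit float arguments."""
    def padded(s):
        b = s.encode("ascii") + b"\x00"              # null-terminate
        return b + b"\x00" * ((4 - len(b) % 4) % 4)  # pad to a 4-byte boundary
    typetags = "," + "f" * len(values)
    payload = b"".join(struct.pack(">f", v) for v in values)
    return padded(address) + padded(typetags) + payload

def send_features(features, host="127.0.0.1", port=6448):
    """Send one frame of extracted features to Wekinator over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/wek/inputs", features), (host, port))
    sock.close()

# e.g. send_features([0.42, 0.07, 0.91])  # placeholder feature values
```

In the actual piece Max's `udpsend` object does this job; the sketch just shows what travels over the wire.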
The final version of this augmenting software for guitar ultimately had to be limited in its use of Wekinator and ML. The slow response I would get during performance pushed me to look for a faster alternative to ML pattern matching. In the end my chord-detecting external (written in C++ using my wrapper for Max 7, available on GitHub) handles the chord-related responses, whilst a custom beat detector and a spectrum-based algorithm complete the hardcoded musical reactions. Wekinator is then only “looking” at the extracted audio features to adjust some parameters in the granulation engine, depending on the degree of match with its trained models.
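The external itself is written in C++ against the Max API, but the general idea behind spectrum-based chord matching can be illustrated with a hypothetical Python sketch: fold the FFT magnitude spectrum into a 12-bin pitch-class profile, then score it against major and minor triad templates. The frequency range and templates here are illustrative choices, not the external's real ones:

```python
import numpy as np

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def chroma(signal, sr):
    """Fold the magnitude spectrum into a 12-bin pitch-class profile."""
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
    pcp = np.zeros(12)
    for f, m in zip(freqs, spec):
        if 55.0 <= f <= 2000.0:                  # illustrative guitar range
            midi = 69 + 12 * np.log2(f / 440.0)  # frequency -> MIDI number
            pcp[int(round(midi)) % 12] += m
    return pcp / (pcp.max() + 1e-9)

def detect_chord(pcp):
    """Score the profile against major/minor triad templates."""
    best, best_score = None, -1.0
    for root in range(12):
        for name, intervals in (("maj", (0, 4, 7)), ("min", (0, 3, 7))):
            tmpl = np.zeros(12)
            tmpl[[(root + i) % 12 for i in intervals]] = 1.0
            score = float(pcp @ tmpl)
            if score > best_score:
                best, best_score = NOTE_NAMES[root] + " " + name, score
    return best
```

Feeding this a mix of sine tones on E, G# and B returns "E maj"; a real guitar signal would of course need smoothing over several frames before the result is stable enough to trigger a response.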
The inspiration for the structure of the piece came after seeing a tutorial video for Wekinator by Rebecca Fiebrink where she was playing a simple game using her voice and Wekinator as a controller. I then decided to create some sort of AI that controlled a noisy synthesis in response to the instrumentalist playing.
I chose to work with my friend Julius Naidu and a guitar, and ended up creating a tool that granulates an array of DnB samples and “talks back” to the player by morphing its response with an envelope-following algorithm. The piece is meant to be performed with a powerful PA and a large stage, so that the sound of the original acoustic guitar can undergo effects and be shadowed by the granulation at certain points.
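The envelope-following part can be sketched as a simple one-pole peak follower whose output would scale the grain amplitude. This is a minimal illustration with made-up coefficients, not the actual Max patch:

```python
def envelope_follower(samples, attack=0.9, release=0.999):
    """One-pole peak follower: fast rise, slow fall.
    Smaller coefficients track faster; these values are made up."""
    env, out = 0.0, []
    for x in samples:
        rect = abs(x)                             # full-wave rectify
        coeff = attack if rect > env else release # pick rise or fall speed
        env = coeff * env + (1.0 - coeff) * rect
        out.append(env)
    return out

# The envelope could then scale grain amplitude or density, e.g.:
# grain_gain = envelope_follower(guitar_block)[-1]
```

The asymmetry (quick attack, long release) is what makes the granulation seem to lean in when the player digs in and linger after he stops.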
The Graphic Score:
To score the performance I was inspired by graphic scores such as Picnic by McQueen or Towards An Unbearable Lightness by Bergstrøm-Nielsen. I wanted to tell a story to inspire the performer and to provide him with a map of the musical territory to explore. I then watched the movie The Desolation of Smaug and the idea crystallised into the final concept: a graphical map to guide the mood of the improvisation of a flamenco player in the “land” of growling machines (see score above).
Since the program I devised responds actively to what the performer plays, the performer is always in control (like the player of a video game) but the sounds can morph into unpredictable granulations (just as if the machine were alive – following the player and responding to the music). The “environment” can also change, depending on the frequency of detected beats or the hitting of certain chords, and this is conveyed by effects changing both the granular synth and the amplified guitar signal.
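The beat detector in the piece is a custom Max patch, but the underlying idea (flag a beat whenever a short window's energy jumps well above the recent average) can be sketched as an illustrative stand-in; window size, history length and threshold here are assumptions:

```python
import math

def detect_onsets(samples, sr, win=512, threshold=1.5):
    """Report times (in seconds) where a window's energy jumps
    above `threshold` times the mean of recent window energies."""
    onsets, history = [], []
    for i in range(0, len(samples) - win + 1, win):
        e = sum(x * x for x in samples[i:i + win]) / win
        if history and e > threshold * (sum(history) / len(history)):
            onsets.append(i / sr)
        history.append(e)
        if len(history) > 43:   # keep roughly 0.5 s of history at 44.1 kHz
            history.pop(0)
    return onsets
```

The rate of detected onsets (or the interval between them) is the kind of value that can then be mapped onto the changing “environment” described above.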
The score contains a set of instructions that the performer can read before starting to play. He then knows what actions trigger what. He is ultimately supposed to look at the graphic side and let the images inspire him. He should follow the trails and explore this machine-ruled environment by improvising and listening to how his actions shape the music and the effects. By following the paths, after exploring all the musical landscapes, he will be able to leave the imaginary world by performing a specific chord at a specific point on the map. The composition would then end.
Post-Performance Note: I was really satisfied with the performance as it portrayed the narrative quite well (based on audience feedback and a live recording I took in Max). Perhaps the granulation synthesis could have been more aggressive at times, but I think that overall Julius interpreted the score and the instrument in a very effective way. I was particularly pleased with the effectiveness of both beat and chord detection and the way Julius used these wisely to tell a compelling story.
I was also pleased that I did not have to intervene at all, with Julius getting the score the way I intended it and the software reacting as expected – from the beat detection that initially activated the granulation down to the various player-based responses. Most importantly, the instrument’s reliability allowed the performer to focus on the score and on his own “dialogue” with the music, rather than looking at the screen. This for me was the biggest achievement.