Polyphony Tutorial 2: Granular Synthesis
In this tutorial we'll look at using the poly~ object to generate a large amount of polyphony in order to play the contents of a single buffer~ of sample data. We'll leverage MSP's ability to play sample data from the same buffer~ at multiple arbitrary speeds and time points to explore the technique of granular synthesis.
Put simply, granular synthesis is the use of very short (or, sometimes, less short) sonic events called 'grains' to generate complex textures. While the musical and written literature on the technique is beyond the scope of this tutorial (see Curtis Roads' Microsound (MIT Press: 2004) for a great exploration of this topic), we'll cover the basics here. While classic granular synthesis relies on the use of very small amounts of wavetable data, the technique we'll explore in this tutorial uses sample data taken arbitrarily from soundfiles.
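To make the idea concrete, here is a minimal sketch of granular playback in Python. This is an illustration of the technique itself, not a translation of the Max patch: grains of 10-50 ms are copied from arbitrary points in a source buffer, shaped by a simple sine window, and summed into an output buffer at random times.

```python
import math
import random

def granulate(source, num_grains, sr=44100, seed=0):
    """Scatter short, enveloped grains taken from arbitrary points in
    `source` and sum them into an output buffer twice the source length."""
    rng = random.Random(seed)
    out = [0.0] * (len(source) * 2)
    for _ in range(num_grains):
        dur = rng.randint(int(0.01 * sr), int(0.05 * sr))  # 10-50 ms grains
        src_start = rng.randint(0, len(source) - dur)      # arbitrary onset in the sample
        out_start = rng.randint(0, len(out) - dur)         # where the grain lands in time
        for i in range(dur):
            env = math.sin(math.pi * i / dur)              # simple sine window
            out[out_start + i] += source[src_start + i] * env
    return out
```

Even this toy version exhibits the defining property of the technique: the output texture is built entirely from many small, overlapping fragments of the source.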
In our tutorial patcher, we'll create an algorithmic playback system based on constrained random values to control the following parameters of a polyphonic sample playback engine: rate, onset point, duration, pitch, amplitude. We'll also look at how adjusting envelopes changes the sonic output.
Experimenting with the patcher
Take a look at the tutorial patcher. There are several numbered areas, each of which controls part of our granular synthesis engine. The patcher area containing the metro object schedules and fires messages into a poly~ object that has loaded 100 voices of an abstraction - the grain emitter proper. Another area allows us to check our CPU usage, which depends on the parameters of our synthesizer. The remaining areas set the synthesis parameters - the sample we're using, which region of it to draw grains from, and the parameters of the grain playback system in the abstraction.
The poly~ object in our patcher generates grains: single bursts of sample playback which we can control dynamically by adjusting parameters. The metro and button objects control the grain emitter. Each time the metro fires, it sends a message into the poly~ prepended by the word note, which assigns the event to the first available voice within the poly~. In addition, each bang from the metro object schedules the next one by adjusting the speed of the metro. The random object generates a random value which is then put through a scale object with a variable output range, defined by the minimum and maximum parameters set elsewhere in the patcher.
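The random-plus-scale idiom is worth spelling out. Here is a sketch of the linear mapping the scale object performs; the input range 0-99 and the output range of 10-250 ms for the grain rate are assumed values for illustration, not taken from the tutorial patch:

```python
import random

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linear mapping in the spirit of MSP's scale object: map value
    from the range [in_lo, in_hi] into the range [out_lo, out_hi]."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# e.g. a random integer 0-99 mapped to a grain rate between 10 and 250 ms
r = random.randint(0, 99)
rate_ms = scale(r, 0, 99, 10.0, 250.0)
```

Changing the output range at run time is what lets the patcher constrain its randomness: the random object always produces the same span of raw values, and only the mapping changes.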
The adstatus object allows us to control and view aspects of the MSP audio driver currently running. All of the viewable attributes of the Audio Status window (available under the Max Options menu) can be accessed via the adstatus object. The mode of the adstatus object (set by its argument - in this case, cpu) instructs the object to output the current CPU usage of MSP. Notice that when the grain emitter is turned off, the CPU usage drops to nearly nothing. This is because our poly~ abstraction mutes itself when its playback has finished. When no notes are firing, all of the copies of the poly~ abstraction should be muted.
Before we look at our poly~ abstraction, notice the effect of longer and shorter grain rates and durations on the CPU usage. Longer grain durations and shorter grain rates result in more voices inside the poly~ being active at any one time - either they are fired more frequently, or they take longer to 'free' themselves, or both. The result is a higher CPU usage.
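A back-of-the-envelope estimate makes the relationship plain. If a new grain fires every grain_rate_ms and each grain lives for grain_duration_ms, then roughly duration/rate grains overlap at any moment, and each overlapping grain is an active (unmuted) voice inside the poly~:

```python
def average_active_voices(grain_duration_ms, grain_rate_ms):
    """Rough estimate of simultaneously active poly~ voices: grains
    fired every grain_rate_ms, each lasting grain_duration_ms."""
    return grain_duration_ms / grain_rate_ms

# long grains fired rapidly -> many simultaneous voices, higher CPU
many = average_active_voices(500, 25)
# short grains fired slowly -> the emitter is mostly idle
few = average_active_voices(50, 100)
```

The numbers here are illustrative; the point is that CPU load tracks the duration-to-rate ratio, not either parameter alone.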
Depending on your computer architecture, you can take advantage of multiple cores in your computer's CPU (or multiple processors if you have a multi-processor machine) by dividing the poly~ object's resources over multiple threads. In essence, this divides the instances of the poly~ object across the different cores or processors of your computer, allowing sets of voices to run in parallel. Depending on your computer's CPU architecture, this may provide a significant boost in performance.
Inside the patch
The abstraction receives a single value (via the in object at the top of the patcher) and uses it to generate a grain of audio, using the MSP logic at the bottom of the abstraction. The trigger object at the top of the patch clearly sets up the order of events for generating our grain:
First, the thispoly~ object receives a pair of messages in immediate succession: one turns on (unmutes) the signal processing in the instance, and the other sets its state to 'busy', so that it won't receive any more note messages until the grain is finished.
Third, a random pitch is selected, which is transformed into a duration multiplier for the line~ objects controlling the playback of the sample and its amplitude envelope. The !/ object divides the incoming pitch into 1., so that a requested pitch of 2 tells the objects downstream to multiply their durations by 0.5 (half as long, and up an octave).
Fourth, a random duration is generated, which sets up the parameters for the line~ objects so that they generate the appropriately scaled and offset values for the grain length.
Finally, a grain is triggered by generating a random start point based on the highlighted areas in the waveform~ object in the main patcher. This eventually generates two messages which command the two line~ objects to generate the playback curve for the play~ object and the amplitude envelope for the *~ objects.
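To see how those two messages relate, here is a hedged sketch of how they might be computed. The message formats and argument ordering below are assumptions for illustration (Max message boxes of the form "jump-to, target ramp-time"), not copied from the patch; the key relationship is real: reading duration_ms * pitch milliseconds of sample in duration_ms of real time is what produces the pitch shift.

```python
def grain_messages(start_ms, duration_ms, pitch):
    """Sketch (assumed message formats) of the two lists sent to the
    line~ objects: a playback ramp for play~ and an amplitude envelope.
    Covering duration_ms * pitch ms of sample in duration_ms of real
    time yields the requested transposition."""
    end_ms = start_ms + duration_ms * pitch
    playback = "%s, %s %s" % (start_ms, end_ms, duration_ms)  # jump to start, ramp to end
    half = duration_ms / 2.0
    envelope = "0., 1. %s 0. %s" % (half, half)               # ramp up, then back down
    return playback, envelope
```

For example, a grain starting 1000 ms into the sample, lasting 100 ms at a pitch of 2, must sweep from 1000 ms to 1200 ms of the buffer in 100 ms of real time.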
Once the 'envelope' line~ is finished, it triggers messages that mute the instance and set its state back to 'free' (no longer busy), so it can receive a new note message.
The poly~ object allows you to have a large number of instances of a single, simple MSP patcher. You can use send and receive objects to communicate with all instances of a poly~ abstraction, and the instances can be distributed across multiple cores or processors with the parallel message. The adstatus object allows you to access and change aspects of the MSP audio driver; the cpu argument to the object lets you see how much of your computer's CPU you are using with a patcher.