Max for Live lets you create audio effect devices that receive audio
input from the Live application, process it in some manner, and pass
their audio output either back to the Live application or to other
downstream audio effect devices in the same audio track where the
device resides.
By convention, a Max for Live device gets all of its audio from the Live
application using the plugin~ object and sends its audio output using the
plugout~ object. Audio input and output are limited to two channels.
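As a minimal sketch of this convention, a two-channel device's signal path
could look something like the outline below; the *~ 0.5 objects are only
placeholders for whatever processing your device actually performs.

```
[plugin~]                 left and right audio from Live (two signal outlets)
    |          |
[*~ 0.5]   [*~ 0.5]       your processing goes here (this sketch just halves the gain)
    |          |
[plugout~]                left and right audio back to Live (two signal inlets)
```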
Note: Sending audio to another Max for Live Audio Effect, Instrument, or
MIDI Effect device using the Max send, receive, send~, or receive~ objects
is not supported.
While you can begin creating a Max for Live audio device by using the
Max for Live device templates, you can also use some of the Max for Live
Audio devices, MIDI effects, and Instruments that come with Max for Live
as your starting point.
These resources include the following:
- The Big Three are a set of Live devices that will provide hours of enjoyment and serve as examples of high-level Max programming. The collection includes one of each type of Live device - an audio effect, a MIDI instrument, and a MIDI effect.
- The Audio Effect Tools demonstrate different tasks and aspects of device creation.
- Each of the Max for Live Tutorials contains a sequence of Audio effects, MIDI effects, or Instruments that you can use as starting points.
- The Building Block devices are simpler example devices that can be used as-is or incorporated into your own devices.
- A subset of the original Pluggo plug-ins from Cycling '74 has been recreated as Live devices.
- Additional abstractions: raw materials for the creation of Max for Live Audio effects, Instruments, and MIDI effects are available as Max for Live abstractions. These abstractions may be of use to you if you are interested in using Max for Live to control some portion of the Live application using the Live API, as sketched in the example below.
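As a minimal sketch of that kind of Live API control, the JavaScript below
could be loaded into a js object inside a Max for Live device. The LiveAPI
object, its get and set methods, and the live_set path are part of the Live
API; the particular function names (bang, settempo) and the choice of the
tempo property are just illustrative.

```javascript
// Illustrative sketch: querying and setting the Live Set tempo from a js
// object inside a Max for Live device. The function names used here are
// arbitrary choices for this example.

autowatch = 1;

function bang() {
    // Point a LiveAPI object at the Live Set itself.
    var api = new LiveAPI("live_set");

    // Read the current tempo and print it to the Max window.
    post("current tempo: " + api.get("tempo") + "\n");
}

function settempo(bpm) {
    // Change the Live Set tempo, e.g. by sending the message "settempo 120".
    var api = new LiveAPI("live_set");
    api.set("tempo", bpm);
}
```

Note that the LiveAPI object only functions when the patch is running as a
device inside Live.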
When Live sends audio or when MIDI triggers audio through an effect created
using Max for Live, the events should all be time-aligned (e.g., if a MIDI
note falls on the downbeat, the MIDI Instrument's audio should also end up
in the mix on the downbeat). A device can provide latency information
to the Live host application so that the host can use latency compensation to adjust the relative timing of different audio tracks.
When you use Max for Live, there are situations where it is useful to set
a latency value to counteract timing differences introduced by your signal
processing. If your signal processing patch requires you to perform some kind
of analysis on a block of samples before any output is produced, you can set
a latency to make sure that the output from your effect will be correctly
aligned in the mix (whereas if you are creating a device in which signal delay
is what you intend, there's no problem and you don't need to set any latency).
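For example (the numbers here are purely illustrative), a patch that must
collect a 1024-sample analysis window before producing any output delays its
output by 1024 samples, which is roughly 1024 / 44100 ≈ 23 ms at a 44.1 kHz
sample rate. Reporting a latency of 1024 samples lets Live shift the track so
the processed audio still lines up with the rest of the mix. You can set this
value with the Defined Latency attribute in the Patcher Inspector: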
- With a device window as the topmost window, choose Patcher Inspector from the View menu to show the Patcher Inspector.
- Double-click in the Value column for the Defined Latency attribute to show a cursor and text box. Type in a latency value in samples, followed by a carriage return.