Very Basic Max for Live Jitter + Download

Max for Live Introduction 

When the developers at Cycling '74 met with Robert Henke and Gerhard Behles of Ableton, Max for Live was born. This programme allows devices such as synthesisers, drum machines, audio effects and MIDI effects to be built, and live video to be brought into the music creation and performance software, using Max MSP. Max for Live lets Max MSP be used to create MIDI effects (which process MIDI data), audio effects (which process audio) and instruments (which take MIDI performance data and turn it into audio).

These devices can be created inside Ableton Live with real-time processing, so you can hear an instrument as you develop it. With the set of dedicated Max for Live objects available (whose names all begin with live.), the parameters of a device you create can be MIDI-mapped to a hardware MIDI controller.

Some of the objects include:

live.dial: A circular slider, or knob, which outputs numbers according to its degree of rotation.

live.gain~: A decibel volume slider and level meter.

live.slider: Outputs numbers as you move an on-screen slider.

live.toggle: A toggle switch that outputs 0 when turned off and 1 when turned on.

By pressing Command and M, these live. objects can be mapped to any recognised MIDI controller. A GUI (Graphical User Interface) can then be designed within the Max for Live vertical device limit, making the device easy and accessible to use.

Max for Live works through access to the Live Object Model, a map of everything accessible within Ableton Live. Not every parameter is exposed in the music software's Application Programming Interface (API); the Live Object Model shows exactly what Max for Live has access to.

[Diagram: the Live Object Model]
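If you prefer scripting to patching, the same Live Object Model can also be reached from JavaScript using the LiveAPI object inside a js object. Here is a rough, minimal sketch of the idea; the paths and property names come from the Live Object Model map, so do check them against your own set.

```javascript
// Lives inside a [js] object in a Max for Live device.
// Rough sketch of reading and writing the Live Object Model through LiveAPI.

function bang() {
    // Point the API at the current Live set and read its tempo.
    var set = new LiveAPI("live_set");
    post("tempo:", set.get("tempo"), "\n");

    // Point it at the master track's volume parameter, read it, then set it.
    var vol = new LiveAPI("live_set master_track mixer_device volume");
    post("master volume:", vol.get("value"), "\n");
    vol.set("value", 0.85); // this parameter runs 0.0 - 1.0 (0.85 is roughly 0 dB)
}
```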

Creating a Max for Live Jitter Patch: 

Below I am going to briefly demonstrate how to create a basic Max for Live Jitter patch. I recommend right-clicking each object used and selecting "Reference" to read more about it; in my opinion that is one of the best ways to learn Max. The download link for the finished patch is below. The idea of this post is to show people new to Jitter how to create a basic audio-reactive patch that works within Ableton Live; I have added comments in the patch to explain how it works.

To create a Max for Live device, we first open Ableton Live. In the browser, select Max for Live and drag an empty Max for Live audio effect onto the master channel.

[Screenshot: an empty Max for Live audio effect on the master channel]

This creates an empty audio device patch with just the plugin~ and plugout~ objects. These represent the audio coming from Ableton Live and the audio being sent to the audio output device.

When creating an audio effect, the signal from Ableton Live (indicated by the green and white patch cords) is routed through the effect you build and then sent to the left and right outputs.

For a Jitter patch, a copy of the audio signal is taken for analysis while the connection between plugin~ and plugout~ is left intact. This means the audio plays back as normal while also being sent to the relevant Jitter object for analysis.

Drag a track of your choice onto the same channel; its audio will be used for the analysis that drives the Jitter patch.

The qmetro object bangs out the frames per second; it is activated with a toggle, and once the toggle is switched on, qmetro starts driving the video. Unlike metro, which triggers a bang at the set interval no matter what, qmetro runs at low priority, meaning it slows its bangs down depending on current CPU usage, resulting in a lower frame rate rather than a backlog of events. To view the frame rate, the jit.fpsgui object is used, attached to the jit.gl.render object.

To allow for the drawing and rendering of OpenGL, the jit.gl.render object is needed; it renders OpenGL objects to the destination window.

For the video to render correctly, a message is needed to handle the erasing and drawing of frames. A trigger object with the arguments b and erase (t b erase) is used: each bang it receives from qmetro first makes it send an erase message, clearing the existing frame, and then a bang to draw the next one. This process repeats for every frame.

Leaving this message out results in the image being drawn over and over again on top of the previous frames.
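For anyone who likes to see the same idea written down, here is a very rough JavaScript sketch of that qmetro, erase and draw cycle using Jitter's js bindings. The context name "mfl_demo" and the 33 ms interval are just example values, and a Task stands in for qmetro.

```javascript
// Rough sketch of the qmetro -> erase -> draw cycle in a [js] object.
// "mfl_demo" is just an example name for the OpenGL drawing context.

var mywindow = new JitterObject("jit.window", "mfl_demo");    // floating output window
var myrender = new JitterObject("jit.gl.render", "mfl_demo"); // renders OpenGL clients into it

// A Task plays the role of qmetro: it runs at low priority,
// so it simply falls behind when the CPU is busy instead of piling up bangs.
var frametask = new Task(draw, this);
frametask.interval = 33; // roughly 30 frames per second

function start() {
    frametask.repeat(); // like switching the main toggle on
}

function stop() {
    frametask.cancel();
}

function draw() {
    myrender.erase();       // clear the previous frame (the "erase" half of t b erase)
    myrender.drawclients(); // draw every jit.gl object attached to this context
    myrender.swap();        // show the finished frame (the "bang" half)
}
```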

To analyse the audio, the jit.catch~ object is used; it transforms the audio signal into Jitter matrices. These matrices can be viewed by connecting a jit.pwindow object to its outlet.
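Those matrices can also be read in code. As a loose sketch, if the outlet of jit.catch~ is patched into a js object, each frame arrives as a jit_matrix message and the cells (one sample value per cell) can be inspected like this:

```javascript
// Rough sketch: inspecting the audio matrix that jit.catch~ sends into a [js] object.
// jit.catch~'s outlet is assumed to be patched into this object's inlet.

inlets = 1;
outlets = 1;

function jit_matrix(name) {
    // Wrap the named matrix that has just arrived.
    var m = new JitterMatrix(name);
    var length = (m.dim instanceof Array) ? m.dim[0] : m.dim; // 1D matrix of samples

    // Find the loudest sample in this frame as a crude amplitude measure.
    var peak = 0;
    for (var x = 0; x < length; x++) {
        var cell = m.getcell(x);
        var v = Math.abs((cell instanceof Array) ? cell[0] : cell);
        if (v > peak) peak = v;
    }

    // Send the peak out, e.g. to scale the size of a gridshape.
    outlet(0, peak);
}
```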


The next stage is to create a shape; to do this, add the jit.gl.gridshape object, which draws predefined shapes such as a sphere, torus, cube, plane or circle.

The jit.op object is added; it is used to add the matrices from the jit.catch~ object to the gridshape. The @ symbol introduces an attribute of an object; in the case of jit.gl.gridshape, @shape sphere is added, so a sphere is drawn automatically once the main toggle is switched on.

To add a menu for an attribute, click the left-hand side of the object (in this case the jit.gl.gridshape object) and select shape; this adds a scrollable menu that lets you switch between the predetermined shapes. The gridshape is then attached to the jit.gl.mesh object, whose attributes give you different draw modes such as polygon, line, point, triangle, quads and line loop, determining how the shape will be drawn. The @auto_colors attribute is added to spread an array of colour across the shape.
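The same attributes can be set from JavaScript if the shape is built with Jitter's js bindings rather than patched. A small sketch, reusing the example context name "mfl_demo" (poly_mode is an extra attribute, not used in the patch above, that gives a quick wireframe look):

```javascript
// Sketch: creating a gridshape in the "mfl_demo" context and setting its attributes from js.

var myshape = new JitterObject("jit.gl.gridshape", "mfl_demo");

myshape.shape = "sphere";             // same options as the menu: sphere, torus, cube, plane, circle...
myshape.color = [1.0, 0.4, 0.7, 1.0]; // RGBA colour
myshape.poly_mode = [1, 1];           // draw front and back faces as wireframe

// Change the shape later, e.g. by sending the message "setshape torus" to the js object.
function setshape(name) {
    myshape.shape = name;
}
```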

The jit.window object is added; this creates a floating window where your visuals are rendered.
The jit.gl.handle object is added; this lets you move the gridshape with your mouse: it can be rotated, zoomed in and out (hold Alt and click), or repositioned on screen (hold Cmd and click).

[Screenshot: finished Max for Live patch]

In Max there are both a Patching Mode and a Presentation Mode, the latter being used to create the graphical user interface seen in Ableton Live; an example can be seen below.

[Screenshot: example of a device GUI in Presentation Mode]

To add an object to Presentation Mode, right-click it and select "Add To Presentation Mode". When all the relevant objects have been added, press Alt+Cmd+E, or click the button highlighted in yellow in the screenshot below, to switch between Patching Mode and Presentation Mode.

[Screenshot: the Patching/Presentation Mode toggle button]

When in Presentation Mode, all the relevant objects can be positioned within the vertical device limit. The device can be downloaded here. Press Cmd+E to unlock the patch and switch to Patching Mode to look around this basic patch.

If you have any questions, feel free to leave a comment. Thank you for reading.

I should have another post up soon enough; please follow if interested.



Summary of my Level 7 and 8 degree Music Projects

Over the last two years my interest in digital art and creative coding has increased. I initially started using VVVV, and this was used in my Level 7 degree final year project, in which an Ableton Live user could trigger both video and audio simultaneously using a MIDI controller I developed on the iPad using Lemur. By assigning the same MIDI control change message to both the audio and the video, the two could be played at the same time using just one laptop.

With VVVV only supporting DirectX on Windows, upon purchasing a MacBook Pro I began to learn Max MSP and Jitter, which I used to develop my Level 8 degree final year college project in music production. Jitter was a steep learning curve for me, and attending a four-day workshop on creative coding at the Digital Arts Studio in Belfast was extremely beneficial. For my final year project I created a Max for Live patch that would enable Ableton Live users to add a simple visual element to their performances. Using audio analysis, the audio from Ableton Live was used to animate the selected gridshape in Max for Live, and the patch was MIDI-controllable using any recognised hardware MIDI controller.

[Screenshot: Level 8 Music Production project]

Over the last month I began working on several pieces, which can be seen in the collage above. Computer-generated art, I suppose you could call it.

Thanks for reading this post; the next one will be on electronica music production.

Max for Live Jitter Project

[Screenshot: the Max for Live Jitter project in progress]

The image above is a screenshot of a project I am currently working on. The feed from my webcam is being used as a texture, which is then applied to the mesh. The mesh allows for different draw modes, changing how the OpenGL shape is drawn. The gridshape object is also used, allowing the shape and colour (in this case pink) to be changed. The shape is rotated onto its side as a personal preference.

The patch is made audio reactive using the jit.catch~ object, which transforms the audio signal into matrices; these are what animate the gridshape. Boosting the amplitude allows quieter sounds to be represented on screen as well.
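As a rough sketch of that amplitude boost in JavaScript terms, a jit.op set to multiply can scale the matrix coming from jit.catch~ before it animates the shape; the gain of 4.0 here is an arbitrary example value.

```javascript
// Rough sketch: boosting the amplitude of the audio matrix before it animates the shape.
// The matrix from jit.catch~ is assumed to arrive at this [js] object's inlet.

inlets = 1;
outlets = 1;

// A jit.op configured to multiply every cell by a fixed gain.
var gainop = new JitterObject("jit.op");
gainop.op = "*";
gainop.val = 4.0; // arbitrary boost so quieter material still moves the shape

// Output matrix; its size is matched to the incoming frame below.
var scaled = new JitterMatrix(1, "float32", 512);

function jit_matrix(name) {
    var audio = new JitterMatrix(name);
    scaled.dim = audio.dim;               // match whatever frame size jit.catch~ sent
    gainop.matrixcalc(audio, scaled);     // scaled = audio * 4.0
    outlet(0, "jit_matrix", scaled.name); // pass the boosted matrix on
}
```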

The idea of the patch is that the video content will change when someone moves in front of the webcam , creating a variation of some sort.

An Introduction

I'm an electronic musician who has recently graduated from Limerick Institute of Technology. Over the years I've developed an interest in digital art and in the relationship between electronic music and video.
I'm an avid user of Ableton Live, Cubase 8, Max MSP/Jitter, Max for Live and Processing. I intend to post screenshots, videos and audio clips of my work and blog about music production, mixing and mastering techniques, digital art and the odd tutorial.

Feel free to follow and comment on posts. Thank you for reading.
