Collected here are projects on various software platforms (mainly Max/MSP/Jitter, but occasionally others) available for download. Some of these projects are free; for others I ask a small pay-what-you-can donation for the time involved in making them accessible and legible.
You may use them however you like, and I would love to hear about any applications you find for them or requests you may have for future versions. You can send me a link or message on the contact page here.
This patch is modeled on the process described in this tutorial by Samuel Pearce-Davies. It is essentially a functional single neuron, the basic building block for a potentially complex, Max-based neural network.
I built this patch as a first foray into the concepts behind machine learning, and have done my best to keep detailed notes (I got lost quite a few times over the course of its construction). The processes for running the neuron as well as summaries of its internal operations are included as comment blocks within the patch.
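For readers coming from outside Max, the neuron's internal operation can be sketched in a few lines of Python: a weighted sum of the inputs plus a bias, squashed by an activation function. This is an illustrative sketch only; the weights, inputs, and sigmoid activation below are hypothetical stand-ins, not values taken from the patch.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum plus bias, then a sigmoid activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # squash the result into the range (0, 1)

# Hypothetical example: two inputs with hand-picked weights
out = neuron([0.5, 0.8], weights=[0.4, -0.6], bias=0.1)
print(round(out, 3))
```

Chaining many of these together, with the outputs of one layer feeding the inputs of the next, is what makes the "complex, Max-based neural network" possible.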
I would love to see an abstraction or evolution of this patch, and may post additions in the future. Enjoy!
patch available for donation soon
Trivelope is a 3-band envelope follower with both normalized and scalable outputs as well as a gate/trigger system. It is an all-in-one workstation for using audio to induce dynamic change across Max/M4L patches.
1 lowpass and 2 bandpass filters split the audio into 3 signal values. Each signal can be gain-scaled individually before and after signal-to-float conversion. You can also add an offset and apply a scalar to adjust the result to a desired range. The rise/fall slopes of each envelope can also be set individually.
Trivelope has 9 outputs:
3 scaled values - one for each scaled signal.
3 raw values from 0. to 1. - useful for interacting with Vizzie/BEAP.
3 triggers that bang when the signal crosses a threshold.
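The per-band signal chain can be sketched outside Max as well. Here is a minimal Python model, assuming a simple one-pole smoother for the rise/fall slopes and an upward-crossing rule for the triggers (the function names and coefficients are illustrative, not taken from the patch):

```python
def envelope(samples, rise=0.01, fall=0.001):
    """One-pole envelope follower with separate rise and fall coefficients,
    modeling a single band after its filter stage."""
    env, out = 0.0, []
    for s in samples:
        level = abs(s)                       # rectify the (already filtered) signal
        coeff = rise if level > env else fall
        env += coeff * (level - env)         # smooth toward the current level
        out.append(env)
    return out

def scale(value, offset=0.0, scalar=1.0):
    """Offset-and-scalar stage, as applied after signal-to-float conversion."""
    return (value + offset) * scalar

def trigger(env_values, threshold=0.5):
    """Emit a bang (True) only on upward threshold crossings."""
    bangs, above = [], False
    for v in env_values:
        bangs.append(v >= threshold and not above)
        above = v >= threshold
    return bangs
```

The raw 0.–1. outputs correspond to the unscaled envelope values; the scaled outputs pass through the offset/scalar stage first.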
Enjoy! Please contact me if you have any questions or requests. This patch is an ongoing project so I will likely update it for future releases.
P.S. After downloading, you can instantiate this patch in presentation mode using [bpatcher trivelope_3].
This is a simple patch with a 4-operator FM algorithm. 3 oscillators modulate one another in sequence, and the resulting waveform modulates a carrier wave of specifiable frequency.
This was one of my early experiments in MSP, and I've cleaned it up so that it may be useful for learning how FM synthesis is produced. Any resemblance to Dexter's Laboratory is purely coincidental.
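The series algorithm can be sketched in Python: each operator's output offsets the phase of the next, and the final modulator offsets the carrier. The frequencies and modulation index below are hypothetical settings, not defaults from the patch.

```python
import math

def fm_chain(t, carrier_hz=220.0, mod_hz=(440.0, 110.0, 55.0), index=2.0):
    """Series 4-operator FM: op3 modulates op2, which modulates op1,
    whose output modulates the carrier's phase."""
    two_pi = 2 * math.pi
    op3 = math.sin(two_pi * mod_hz[2] * t)
    op2 = math.sin(two_pi * mod_hz[1] * t + index * op3)
    op1 = math.sin(two_pi * mod_hz[0] * t + index * op2)
    return math.sin(two_pi * carrier_hz * t + index * op1)

# Render one second at 44.1 kHz (hypothetical settings)
sr = 44100
samples = [fm_chain(n / sr) for n in range(sr)]
```

Raising the modulation index deepens the sidebands at each stage, which is where the characteristic FM brightness comes from.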
A device that accommodates my style of Ableton Push use within the Max context; it turns the top row of buttons and encoders into an independent MIDI control interface for Max.
I usually keep a keyboard on top of my Push 1 pads to save space while using Max, and developed a patch that utilizes Jeff Kaiser's jk.push package to form a simple bridge between the device and Max without using Ableton Live itself. Note that this package is required for the Push Palette to work.
This bpatcher uses the 8 rotary encoders on the top row of the Push, the master rotary encoder, and the 2 tempo/swing encoders on the left of the top row. It also uses the master track button and the track mute/select buttons as gates and triggers respectively, as well as the ribbon controller strip as an individual output. Hovering over the outputs of the bpatcher or investigating the unlocked patch will identify each output and its type. The controls interact with the Push's color-codable LEDs, and there are also options to black out the Push lights or reset all values if desired.
I haven't tested it with the Push 2, but odds are any issues will require only some basic reordering of patch cords. Let me know if it works for you!
This patch is based on one of Federico Foderaro's Amazing Max Stuff tutorials. These tutorials are a free and helpful way into Max and Jitter, and I cannot recommend them enough.
The patch performs a shader-type operation on a GL texture. First, the texture is converted into a Jitter matrix so that edge detection can be performed. Then, the luma values are used to create a vertex map of the video frame so that a network of lines, points, or shapes can be drawn over the frame. Finally, the matrix is converted back into a texture so it can interact with other modules in the VIZZIE library. The "visible" control allows you to peek at the matrix inside the module and check for bugs or differences from the output.
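To make the matrix-side steps concrete, here is a rough Python model of the middle of that pipeline, assuming a standard Rec. 601 luma weighting and a Sobel kernel for the edge detection (the patch's exact kernel and thresholds may differ):

```python
def luma(frame):
    """frame: rows of (r, g, b) tuples with components in 0..1 -> luma plane."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in frame]

def sobel(plane):
    """Edge magnitude via horizontal/vertical Sobel kernels (borders skipped)."""
    h, w = len(plane), len(plane[0])
    edges = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (plane[y-1][x+1] + 2*plane[y][x+1] + plane[y+1][x+1]
                  - plane[y-1][x-1] - 2*plane[y][x-1] - plane[y+1][x-1])
            gy = (plane[y+1][x-1] + 2*plane[y+1][x] + plane[y+1][x+1]
                  - plane[y-1][x-1] - 2*plane[y-1][x] - plane[y-1][x+1])
            edges[y][x] = (gx * gx + gy * gy) ** 0.5
    return edges

def vertices(edges, threshold=0.5):
    """Keep the (x, y) positions whose edge magnitude clears the threshold --
    these become the points/lines/shapes drawn over the frame."""
    return [(x, y) for y, row in enumerate(edges)
            for x, v in enumerate(row) if v > threshold]
```

In the patch these operations happen on Jitter matrices rather than nested lists, but the flow is the same: luma plane in, vertex positions out.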
Be advised: converting from texture to matrix and back again is a computationally expensive process. I'm still working on an alternate method, but results can still be easily generated with this abstraction if you're handy and sparing with its use.
Playing with the settings can yield a pretty broad range of effects. Let me know if you find it useful. Enjoy.
patch available for donation soon
TextGen is a bit of an oddity. It is an all-in-one workstation for developing and rendering text, typography, and animations, with a healthy dose of randomness à la the Freudian slip slipped in. The applications are endless, or maybe not. Depends on your ego.
In the center of the device are main and secondary textedit interfaces. These can be displayed and interacted with directly in an OpenGL context (with the "visible" control) or output for use with other Max/VIZZIE objects. There is a twist though...
The main and secondary text boxes can be displayed either as block text or word by word. Which text is rendered at each step is governed by a chance process; you can affect the balance by adjusting the "%id..." control. Excellent for subliminal messaging experiments.
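The word-by-word chance process amounts to a weighted coin flip on every bang. Here is a rough Python model, where pct_main is a hypothetical stand-in for the "%id..." control (the patch's internal logic may differ):

```python
import random

def next_word(main_words, secondary_words, step, pct_main=70):
    """Pick which stream supplies the next rendered word.

    pct_main is a stand-in for the '%id...' control: the percentage
    chance that the main text wins the flip on any given bang.
    """
    source = main_words if random.randrange(100) < pct_main else secondary_words
    return source[step % len(source)]

main = "the quick brown fox".split()
other = "SUBLIMINAL MESSAGE HERE NOW".split()
stream = [next_word(main, other, i, pct_main=70) for i in range(8)]
print(" ".join(stream))
```

At 70% the main text dominates with occasional intrusions; push the balance toward 50% and the two streams interleave unpredictably.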
If you'd like to play with randomness-driven text experiments, there is an experimental feature that outputs transcripts of the most recently displayed text.
The rendering is driven using bang messages, either using an internal/external clock or manually with the "speak" button. There are buttons to reset object transforms, object defaults and transcript memory.
As you might have guessed, this project is fairly niche and an ongoing interest experiment. Nevertheless, it has taken a fair amount of design and time, so I do ask for a small donation. If you have more questions or would like to know more about the device, please contact me here. Enjoy!
jit.physgui is a one-stop interface for controlling Max's jit.phys.world object on the fly. It's simply a set of controls I find myself adjusting constantly while running physics simulations in Max, and so decided to compile into a little snippet with a handy GUI.
jit.physgui contains controls for gravity, worldbox scale, and plane removal on the xyz axes. It also contains on/off switches for the whole simulation, the worldbox, collision reporting, and rigid body dynamics. One can also use the device to set simulation resolution (in Hz) and the maximum number of substeps calculated between frames. There is also an option to save jit.phys.world presets.
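For context, the resolution and substep settings follow the fixed-timestep pattern common to physics engines: the world advances in fixed slices of 1/resolution seconds, with a cap on how many slices can run per rendered frame. A rough Python sketch of that loop (an illustration of the concept, not the patch's actual implementation):

```python
def step_world(advance_seconds, fixed_dt=1/60, max_substeps=4, accumulator=0.0):
    """Fixed-timestep stepping: advance the simulation in fixed_dt slices
    (the resolution in Hz is 1/fixed_dt), capped at max_substeps per frame
    so a slow frame can't stall the renderer."""
    accumulator += advance_seconds
    substeps = 0
    while accumulator >= fixed_dt and substeps < max_substeps:
        # world.step(fixed_dt)  # one physics tick would happen here
        accumulator -= fixed_dt
        substeps += 1
    return substeps, accumulator
```

Raising the resolution makes collisions more accurate; raising the substep cap keeps the simulation from falling behind when frames arrive slowly.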
Using jit.physgui allows one to communicate directly with the jit.phys.world object inside a jit.world, sidestepping the need to use a separate context or compile messages individually and prepend them. Nothing special, just convenient. Enjoy!
more projects to be made available soon...