Friday, December 3, 2010

Using the MCU protocol to communicate

I think Ableton Live does what it was made to do really well. Native Instruments Maschine does too, and it's now taking live performance with pattern-based sequencing to another level by tackling hardware integration. In particular, they are all about keeping your eyes off the screen whenever it can be avoided (amazing direction, use the processor and shut the fucking laptop!). Now to be fair, Ableton paired with an MCU-mapped control surface seems to pretty much do what Maschine is trying to do, at least conceptually, since the MCU protocol is used to display dynamic control labels on controller screens (like Novation's Automap stuff). I just found out about MCU and was psyched to see this Maschine MIDI template in the NI Maschine forums. The template uses Maschine's MCU mapping capabilities and makes short work of Ableton's interface. Neon Fucking Golden.

So, when I became aware of the _Framework scripts, at first I wanted to contribute to hanz's awesome work on the APC40_22 script that he created in a fit of awesomeness. I basically wanted to add a 'mode' on my own and offer it to his project. But the more I saw of what he had done, the more I saw things I wanted to implement myself, and I figured I needed to learn _Framework better before shooting for the stars in my eyes. So I kept digging into how exactly Ableton's _Framework actually makes the magic happen, and as I messed with my APC40_22 and my Maschine, I kept wishing there were some way to use Maschine's screens.

I saw the possibilities of a DJ-like live performance workflow that would only require me to use the Maschine in its various modes (and using the instance switcher). A very basic example: put the Maschine VST on two Live tracks and control Ableton, some FX, and maybe some clips via the Maschine controller in MIDI mode. Then I would load a track into each Maschine instance and bring the Live fader down on one. I could start and mix a track in the other one, then switch to control mode, slowly mix in the other Maschine track, create a cool transition with FX in Live, then mix the FX and the first Maschine track out, and continue ad infinitum.

So, I am working on my own version of the feature-lean (but not unappreciated) Maschine MIDI template for Ableton, which works with an NI-provided MIDI Remote Script. I was trying out different things in the NI Controller Editor (changing the control mappings) and flipping through the remote scripts in Ableton's preferences, when suddenly I saw text pop up on the Maschine screens. It showed up in the middle as two messages: one said 'Ableton Live' and the other said 'Device Offline'. This was the Mackie Control script doing it, and after being very confused and trying out all sorts of stuff, I finally figured out that you only need one of the controls around the Maschine screens set to an MCU-style control to get these messages to show up. Then I downloaded MIDI Monitor and SysEx Librarian from snoize to find out what was happening between the script and the controller. That let me capture and learn the basic MCU sysex headers and the messages needed for writing to the Maschine screens. Holy wow, I was so excited, my girlfriend took a trip to Montana to get away from me blabbing this blog post to her over and over.
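For anyone curious, here's roughly what those captured messages look like. This is just a sketch based on my MIDI Monitor captures, not NI's or Ableton's actual code: the Mackie header bytes and the 0x12 LCD-write command are standard MCU, but the helper name and the send call are placeholders you'd swap for whatever your remote script or MIDI library really uses to push raw bytes out a port.

```python
# Rough sketch of the MCU LCD sysex format as seen in MIDI Monitor.
# build_lcd_sysex() is a made-up helper name; send_midi() stands in for
# however your remote script or MIDI library actually sends raw sysex.

MCU_HEADER = (0xF0, 0x00, 0x00, 0x66, 0x14)  # sysex start + Mackie manufacturer ID + MCU device ID
LCD_WRITE = 0x12                             # 'write to LCD' command byte

def build_lcd_sysex(text, offset=0):
    """Build a sysex message that writes `text` starting at `offset`.
    The MCU display is addressed as 112 characters: two 56-char rows."""
    chars = tuple(ord(c) for c in text[:112 - offset])
    return MCU_HEADER + (LCD_WRITE, offset) + chars + (0xF7,)

# e.g. put one label on the top row and another on the bottom row (offset 56):
top = build_lcd_sysex('Ableton Live')
bottom = build_lcd_sysex('Device Offline', offset=56)
# send_midi(top); send_midi(bottom)
```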