Saturday, December 18, 2010

Simple Insight

Some of the Remote Scripts have a function called update_display. I was scouring various online resources for info on it, because I wanted to use it to make a blinking button. update_display is part of the 'special behavior' that every MIDI Remote Script must implement: the script effectively has to satisfy a sort of abstract class, providing a fixed set of methods that Live calls into.
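Since update_display turns out to get called on a timer (roughly every 100 ms, judging by the timestamps in the log further down), a blinking button can be sketched by just counting ticks. This is a minimal, hypothetical sketch, not real _Framework code: the send_midi callback stands in for however the script talks to the hardware, and the note number, velocities, and blink period are made-up example values.

```python
NOTE_ON = 0x90
LED_NOTE = 0x35        # hypothetical button LED note number
BLINK_PERIOD = 5       # toggle every 5 ticks (~500 ms at ~100 ms per tick)

class Blinker(object):
    def __init__(self, send_midi):
        self._send_midi = send_midi
        self._ticks = 0
        self._lit = False

    def update_display(self):
        # Live calls this roughly every 100 ms; use it as a timer.
        self._ticks += 1
        if self._ticks % BLINK_PERIOD == 0:
            self._lit = not self._lit
            velocity = 127 if self._lit else 0
            self._send_midi((NOTE_ON, LED_NOTE, velocity))
```

In a real script the same counting logic would live in the ControlSurface subclass's update_display override.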

In reading the sparse comments in some of the _Framework classes, I got the big idea to run this script, hoping it would show the attributes of the Live module that one can import:
import Live
def create_instance(c_instance):
  # Write every attribute name of the Live module to Live's log file
  for name in dir(Live):
    c_instance.log_message(name)
  return None
Then I checked my log file and found output that reflects the LOM, plus a nice little class signature formed of errors:
8713361 ms. RemoteScriptMessage: Application
8713362 ms. RemoteScriptMessage: Clip
8713362 ms. RemoteScriptMessage: ClipSlot
8713362 ms. RemoteScriptMessage: Device
8713363 ms. RemoteScriptMessage: DeviceParameter
8713363 ms. RemoteScriptMessage: MidiMap
8713363 ms. RemoteScriptMessage: MixerDevice
8713364 ms. RemoteScriptMessage: Scene
8713364 ms. RemoteScriptMessage: Song
8713364 ms. RemoteScriptMessage: String
8713365 ms. RemoteScriptMessage: Track
8713365 ms. RemoteScriptMessage: __doc__
8713365 ms. RemoteScriptMessage: __name__
8713366 ms. RemoteScriptError: AttributeError
8713366 ms. RemoteScriptError: :
8713366 ms. RemoteScriptError: 'NoneType' object has no attribute 'refresh_state'
8713367 ms. RemoteScriptError:

8713367 ms. RemoteScriptError: AttributeError
8713367 ms. RemoteScriptError: :
8713368 ms. RemoteScriptError: 'NoneType' object has no attribute 'connect_script_instances'
8713368 ms. RemoteScriptError:

8713368 ms. RemoteScriptError: AttributeError
8713369 ms. RemoteScriptError: :
8713369 ms. RemoteScriptError: 'NoneType' object has no attribute 'can_lock_to_devices'
8713369 ms. RemoteScriptError:

8713525 ms. RemoteScriptError: AttributeError
8713525 ms. RemoteScriptError: :
8713526 ms. RemoteScriptError: 'NoneType' object has no attribute 'build_midi_map'
8713526 ms. RemoteScriptError:

8713552 ms. RemoteScriptError: AttributeError
8713552 ms. RemoteScriptError: :
8713553 ms. RemoteScriptError: 'NoneType' object has no attribute 'suggest_input_port'
8713553 ms. RemoteScriptError:

8713554 ms. RemoteScriptError: AttributeError
8713554 ms. RemoteScriptError: :
8713555 ms. RemoteScriptError: 'NoneType' object has no attribute 'suggest_output_port'
8713555 ms. RemoteScriptError:

8713681 ms. RemoteScriptError: AttributeError
8713681 ms. RemoteScriptError: :
8713682 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8713683 ms. RemoteScriptError:

8713791 ms. RemoteScriptError: AttributeError
8713791 ms. RemoteScriptError: :
8713792 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8713793 ms. RemoteScriptError:

8713901 ms. RemoteScriptError: AttributeError
8713902 ms. RemoteScriptError: :
8713903 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8713903 ms. RemoteScriptError:

8714001 ms. RemoteScriptError: AttributeError
8714001 ms. RemoteScriptError: :
8714002 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8714002 ms. RemoteScriptError:

8714104 ms. RemoteScriptError: AttributeError
8714104 ms. RemoteScriptError: :
8714105 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8714106 ms. RemoteScriptError:

8714211 ms. RemoteScriptError: AttributeError
8714212 ms. RemoteScriptError: :
8714213 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8714213 ms. RemoteScriptError:
... update_display continues to be called at a regularish rate
8799427 ms. RemoteScriptError: AttributeError
8799427 ms. RemoteScriptError: :
8799428 ms. RemoteScriptError: 'NoneType' object has no attribute 'disconnect'
8799428 ms. RemoteScriptError:

Monday, December 13, 2010

Friday, December 3, 2010

Using MCU protocol to communicate

I think Ableton Live does what it was made to do really well. Native Instruments Maschine does too, and it is now taking the innovation of live performance with pattern-based sequencing to another level by tackling hardware integration. In particular, they are all about keeping the eyes off the screen when it can be avoided (amazing direction: use the processor and shut the fucking laptop!). To be fair, Ableton paired with an MCU-mapped control surface seems to pretty much do what Maschine is trying to do, at least conceptually, since the MCU protocol is used to display dynamic control labels on controller screens (like Novation's Automap stuff). I just found out about MCU and was psyched to see this Maschine MIDI template in the NI Maschine forums. That template uses Maschine's MCU mapping capabilities and makes short work of Ableton's interface. Neon Fucking Golden.

So, when I became aware of the _Framework scripts, at first I wanted to contribute to Hanz's awesome work on the APC40_22 script that he created in a fit of awesomeness. I basically wanted to add a 'mode' on my own and offer it to his project. But the more I saw of what he had done, the more things I found that I wanted to implement myself, and I realized I would need to learn _Framework better before shooting for the stars in my eyes. So I kept digging into how exactly the Ableton _Framework actually makes the magic happen, and as I messed with my APC40_22 and my Maschine, I kept wishing I could somehow use Maschine's screens.

I saw the possibilities of a DJ-like live performance workflow that would only require me to use the Maschine in its various modes (and using the instance switcher). A very basic example: put the Maschine VST on two Live tracks and control Ableton, some FX, and maybe some clips via Maschine's MIDI controller mode. Then I would load a track into each Maschine instance and bring the Live track down on one. I could start and mix a track in the other one, then switch to control mode, slowly mix in the other Maschine track, create a cool transition with FX in Live, then mix the FX and the first Maschine track out, and continue ad infinitum.

So, I am working on my own version of the feature-lean (but not unappreciated) Maschine MIDI template for Ableton, which works with an NI-provided MIDI Remote Script. While trying out different things in the NI Controller Editor (changing the control mappings) and flipping through the Remote Scripts in Ableton's preferences, I suddenly saw text pop up on the Maschine screens. It showed up in the middle as two messages: one said 'Ableton Live' and the other 'Device Offline'. The Mackie Control script was doing this, and after much confusion and trying out all sorts of stuff, I finally found that you need only set one of the controls around the Maschine screens to be an MCU-style control in order to get these messages to show up. Then I downloaded MIDI Monitor and SysEx Librarian from snoize to find out what was happening between the script and the controller. This let me capture and learn the basic MCU sysex headers and the messages needed for writing to the Maschine screens. Holy wow, I was so excited my girlfriend took a trip to Montana to get away from me blabbing this blog post to her over and over.
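For reference, here is a sketch of the kind of LCD write message I captured. The F0 00 00 66 14 header and the 0x12 'write LCD' command are standard Mackie Control, and the two-row, 56-character layout is what the MCU display uses; how Maschine maps those offsets onto its own screens is my assumption, not something I have verified.

```python
# Sysex start + Mackie manufacturer ID + MCU device ID
MCU_HEADER = [0xF0, 0x00, 0x00, 0x66, 0x14]

def lcd_message(text, row=0, column=0):
    """Build an MCU sysex message that writes `text` at (row, column)."""
    offset = row * 56 + column               # each LCD row is 56 characters
    payload = [ord(c) & 0x7F for c in text]  # sysex data bytes must be 7-bit
    return MCU_HEADER + [0x12, offset] + payload + [0xF7]

msg = lcd_message('Ableton Live')
```

Watching these bytes go by in MIDI Monitor while the Mackie script runs is the easiest way to confirm the header for your own controller.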

Grooveboxes, sequencing, and Roland's polyphony impotency

If you have used 'grooveboxes', either one of the diamonds in the rough or their annoying ilk, then you may understand just how unique the Yamaha groove-style sequencers are (including the QY series, though I never owned one). Amongst the throngs of seething 16-step rhythm-and-bass-machine-style sequencer lovers, I dream of an evolution of pattern-based sequencing with scenes for morphing sets of controllers. I used these Yamahas weekly to create experimental ambient and Rephlex-like braindance music. Eventually I sold them, because I had sort of used them up, and I like change and flux and don't want my tools to define my music.

Ableton Live is actually very much like these Yamaha sequencers. Similar to those pattern/section sequencers, Live has a linear 'song' mode and the ability to trigger sets of phrases which play sounds on tracks (called scenes in Live, patterns on the grooveboxes). Ableton also adds the only useful abilities of the Roland phrase samplers (SP-404, SP-808) and the MPC samplers: triggering samples and sequences live or out of sequence. Luckily, it inherited neither the horrible effects nor the ridiculously low polyphony of those Roland toys.

I have observed that the feature sets of electronic music devices are basically locked specifications: set per device or per application and then never really changed in major ways. Too bad. Right, right, I know.. just use Max/MSP.. yeah, totally. I love what is possible with Pd, Max, AudioMulch, Bidule, and other visual programming paradigms. Though I may just be a sucker, or lazy, or unwilling to reinvent the wheel, I immediately gravitated towards Live when it was a baby little version 1.0. I think of Ableton versus Max/MSP the way I think of OS 9 (and earlier Apple OSes) versus Unix. In that analogy, Unix and Max are both very powerful but slightly painful for anyone without the experience to create and maintain custom tools, while OS 9 was better looking, smoother, and took little effort to learn. Back when OS 9 was around, I was baffled by Unix, idolized hackers, and also couldn't afford Apple computers. Now OS X is tightly integrated with Unix and it runs Max for Live. More than a win-win, that's a win cubed. Blah blah blah..

Sunday, November 28, 2010

Dreaming like a true control freak

That is what I have been doing: dreaming. Oh, and some drooling, and a lot of searching and digging through the net. I guess I will start a links list like Hanz over at the _Framework blog.

As an intro to this upload: There are some features that I have always wanted for the gear that I use to create my dreams in music. I am also interested in the entire composition workflow and style of live electronic music performance where these features would be helpful.

I want to detail some of my history with and interests in different gear, from the perspective of how the gear was or wasn't useful for informing an innovative live electronic music performance. Part of what I mean by 'live electronic music performance' is the ability to improvisationally arrange created, found, or improvised composition elements expressively, as well as to orchestrate sound design nuance and musicianship with pre-recorded and on-the-fly created phrases and samples.

Since I began creating music with the Yamaha RM1x in the mid-nineties, and on through the early aughts using the Kurzweil K2600 and the Yamaha RS7000, I dreamt of modifying the OS on those pieces of hardware. In 2005 I left Massachusetts College of Art, and this marked a conscious divergence from my concentration in live performance to follow a path towards gaining the experience I would need to one day hack, customize, create, or otherwise manipulate my reality to get what I was looking for in music performance tools. Now I am a full-time web programmer, and I am starting to put my knowledge to work for music tools. I continue to create music in the studio, have done commercial audio work, and have worked hard on my sound design and mixing capabilities.

Wednesday, October 20, 2010

Embedded Python in Ableton Live

It's late and I just decided to start posting here, for all the right reasons. I haven't been able to stop hacking around with the Live Python API. I have a project in the works that will be a great extension to the MIDI Remote Script for any controller that has a button grid (or any buttons, actually..). So far I have read all of the articles about the _Framework classes and pulled a bunch of code together to make a simple logger that lets me easily dump objects, do a simple trace, etc. So rad, I love Python. Yeah, this is a teaser post; I hope to write more and post code soon too.
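To give a taste of what I mean by a simple logger: nothing below is real _Framework API except c_instance.log_message, which is what Live hands a Remote Script for writing to Log.txt; the Logger class and its method names are my own invention.

```python
class Logger(object):
    """Tiny helper that writes messages and object dumps to Live's Log.txt."""

    def __init__(self, c_instance):
        self._c_instance = c_instance

    def log(self, msg):
        # Shows up in Log.txt prefixed with "RemoteScriptMessage:"
        self._c_instance.log_message(str(msg))

    def dump(self, obj):
        # Write every attribute name of obj, one per line
        for name in dir(obj):
            self.log('%s.%s' % (obj.__class__.__name__, name))
```

Inside create_instance you would just do something like logger = Logger(c_instance) and pass it around.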