Tuesday, May 20, 2014

Advanced Ravers and Dragons 3

What a blast! misterinterrupt performed a meltdowntown-style set based on the SSTVMV tracks. Here's a recording of the performance:

always ready to go beyond, no other choice

Our passionate urge toward freedom of communication in this internet medium is fueled partially by a collective hindsight. In light of the other major telecommunication mediums' demise during our recent century, we realize the worth of this space. Facilitation of communication via radio and television was monopolized into inaccessible, uncontrollable, one-way media.
The ideas we want to be upheld have been shouted into silence by (anti)communications monoliths. We aren't going to forget and we cannot let this continue.

Unfortunately, all mediums of communication are suffering. As our bias towards the newest mediums climaxes in a convergence of convenient content for sale and concomitant laziness, we are simultaneously painting ourselves into a corner and backing up to a ledge. Now it's time to fly.

Wednesday, January 22, 2014

sub-strata travel via minus-viewing II

} foR immediatE releasE intO ouR neW oblivioN:

misterinterrupt substrata travel via minus viewing
} sound design, composition, mixing, mastering: m. howell
} album covers: big fingers by ken kravens, night wood by m. howell, aiieee by avid odd
} distro via bandcamp

} While a lot of folks were mashing their brains to an overdose of pitched down vocals, dissonant ipad synths and garageband reverb presets, misterinterrupt has been focused on remixing his own tracks over and over and over and over. In an intentionally anti-pop vacuum, he has concocted a noisy alchemical genre that he has dubbed guerrilla ambient using a process akin to musical necromancy.

} The process starts off by building up new tracks well into their third trimester, but before birthing, they are aborted, chopped up unceremoniously, new tracks are begun using the pieces of the old. Where many electronic artists approach their sound design with some attempt at a surgeon's precision, interrupt employs a chirurgeon's butchery. To create each successive track, he records uneven loops of multiple tracks at once, ignores common rhythmic divisions, and disregards anything other than non-systematic microtonal musicality. Each level of the remixing process follows this strictly necromantic model he calls necro techno.

} Some of the final edits have earned his aesthetic confidence and are released as the mysteriously named sub-strata travel via minus-viewing II. The concept behind the title is to create new sonic tools for practitioners of remote viewing. misterinterrupt seems to understand the unique geomantic, telespatial, and astral characteristics of getting your head all fucked up via deep listening, so CIA drop-outs are invited to test his new methods.

} For everyone else, presented herein are five tracks of shapeshifting pads, disfigured beats, darkly beautiful environments, and an overall movement that has an unmistakable character. sstvmv II was created specifically to help induce hypnagogic states in its listeners. The sounds quickly work themselves into deep sonic knots that decrypt and scramble perceptual codes etched as low frequency marks in a background of frenetic gaseous tones. The music is pulled forth via deeply sculpted psychedelic interfaces to the cochlear nerve, and the rhythms keep a strange amphetamine pace.

} Though sstvmv II is intended as ambient music, it may sound like an active listening experience to some. It is obviously a complex layering of ambient and active musics. It is offered as a composition to be completed by the listener or the listeners' environments. misterinterrupt can only get one so far and suggests that listeners explore his compositions in concert with whatever inner or outer noise they might hear.

} The full album is downloadable and streamable for free. The album download includes three covers from the graphics release. You can name your price at whatever you can afford here:


Tuesday, February 19, 2013

Travis Wyche, Grisaille

::::::Oh my, today in my Walkman is Travis Wyche, Grisaille. I have to say that this has a way of driving through clouds of memories for me. The sound moves me emotionally into what seems like great realms of desert and ocean animal spirits. There is a balance of some vast esoteric concept and an unimaginable physicality in Grisaille. It expresses the wastelands and the far shores in a way that doesn't make you feel wasted, but enriched, like these fantastic mythic expanses themselves. Wyche is obviously an expert at using (analog) delay in such a way that you forget the sound is repeating. The synth and noise craft expresses a shade of dryness. The sounds are silky even in the highest frequency modulations. These sounds are like the shearing of fabric. The lush smoothness of the longer pads is like the rest of the uncut, soft, undulating sheets. They reverberate with the feeling of pleasure one receives along with sunlight.. and that's just a few words about side B! There is so much more in this recording on side A. ::::Matty http://sanitymuffin.bigcartel.com/product/travis-wyche-grisaille-c-54

Saturday, July 2, 2011

MIDI Remote Script Design Challenges

I have been spending what free or non-music time I have on programming something similar to Hans Petrov's MIDI Remote Script for the Akai APC40, but for Native Instruments' Maschine. The Maschine did come with its own script and controller template for Ableton Live, but I guess I want to create a better one. I want it to integrate the two workflows more, specifically during live performance.

_Framework is a component set, written at Ableton, that can be composed and extended to make Live interact with the different parts of MIDI controllers. A script could plausibly be a simple combination of these classes in one file, or, if you want more complex controller interaction, split over multiple files in folders. In my vision for VoidMaschine, there are multiple functions for different groups of elements, so there will need to be states or modes built into the way the elements work. That could be over-architected into more work than I need for this script, though. Someone might be drinking Kool-Aid, saying "Always use a design pattern, or beware the wrath of [insert Kool-Aid drinker's preferred language progenitor or platform demi-god's name here]!!!" But, don't worry, be happy with shitty code, "it works, doesn't it?" "But is it scalable?" "I'll scale you, dude, how's that?" JK about that drama. I intend on building states into the script in whatever way is easiest to modify later and appropriate for each mode/element combination.
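To make the mode/state idea concrete, here's a rough sketch of the kind of dispatch I'm imagining. Everything here (the ModeRegistry name, the handler-dict shape) is my own invention for illustration, not anything from _Framework:

```python
class ModeRegistry:
    """Minimal mode switcher: each mode maps element names to handlers."""

    def __init__(self):
        self._modes = {}
        self._current = None

    def add_mode(self, name, handlers):
        # handlers: dict mapping an element name to a callable
        self._modes[name] = handlers
        if self._current is None:
            self._current = name  # first registered mode is the default

    def set_mode(self, name):
        if name not in self._modes:
            raise KeyError("unknown mode: %s" % name)
        self._current = name

    def handle(self, element, *args):
        # Dispatch an element event to the current mode's handler, if any.
        handler = self._modes[self._current].get(element)
        if handler is not None:
            return handler(*args)
        return None
```

In use, the same pad could launch a clip in a 'scene' mode and enter a note in a 'pad' mode, with the mode buttons just calling set_mode.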

I still have some confusion about the APC40_* script, particularly where its code does something like what I want to do. For instance, some of the classes in the APC40_* source are prefixed with 'Shiftable', e.g. 'ShiftableTransportComponent' and 'ShiftableSelectorComponent'. These extend different classes: 'TransportComponent' and 'ModeSelectorComponent', respectively. This confuses me a little. I know that these implement toggle behavior for a component (pressing a modifier or 'shift' button to complete an action) AND a 'mode' component. The second looks more like a traditional state machine, but there isn't much documentation. I want the class naming to be self-documenting at least, and I also think I need to study the code more.
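As far as I can tell, the 'shiftable' idea boils down to a component checking a shift flag before deciding what a press means. A toy version (the class and method names are mine, not Ableton's):

```python
class ShiftableButtonTarget:
    """A control target whose action changes while a shift button is held."""

    def __init__(self, normal_action, shifted_action):
        self._normal = normal_action
        self._shifted = shifted_action
        self._shift_held = False

    def set_shift(self, is_held):
        # Would be called from the shift button's value listener.
        self._shift_held = bool(is_held)

    def press(self):
        # Choose the action based on the current shift state.
        action = self._shifted if self._shift_held else self._normal
        return action()
```

A separate mode selector, by contrast, would swap out whole handler sets rather than flip one flag, which is probably why the two 'Shiftable' classes extend different parents.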

The VoidMaschine script has very similar behavior to the APC40_* script. The functionalities of Live that they control are represented by component classes that combine groups of control elements:

1 row of 8 buttons,
2 displays,
1 row of 8 knobs,
7 miscellaneous buttons,
8 group buttons,
8 transport control buttons,
3 master section knobs,
8 'mode' buttons ('scene', 'pattern', 'pad mode', etc. next to the pads),
16 velocity-sensitive pads

The current script does these things:

The transport controls are mapped so:
'restart' : play/restart
'play' : stop (yeah)
'rec' : record
'erase' : back to arrangement
'<' & '>' : time-line navigation
Note Repeat is mapped to tap-tempo
Master Volume is mapped to the Volume knob
Master Tempo is mapped to the Tempo knob (that's nice, it says what it is mapped to, whew)

I have 6 of the Group buttons blocked out: up(B), down(F), left(E), & right(G) navigate the infamous session box, while the last 2, up(D) & down(H), handle vertical scene navigation. The 'scene' mode button triggers the currently selected scene. A & C stay dark (for now...).

The pads trigger clip slots, but they don't light up to show that they have a clip in them. I am not sure I can use Hans's APC script as a model for setting up the launch buttons. The button class that he created utilizes different MIDI note values which correspond with the lights in the APC40's buttons. Maschine doesn't seem to expose any internal color or blinking state for its buttons, so I am working on a timer-based blinking state for the clip buttons in the script. Par for the course, the APC script also has a scrolling behavior that is triggered if you hold the navigation buttons for a second or so. My experience with timers, in just about every other language I know, has been to use them, or the events they trigger, to create animation frame loops; just not in Python yet (go figure).
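The timer-based blinking I have in mind is really just a tick counter toggling LED state. A rough sketch, where the tick source and the LED-send callable are stand-ins for whatever the real script gets from Live (in practice the tick could come from the regularly-called update_display):

```python
class BlinkState:
    """Toggle a pad LED on and off every `period` timer ticks."""

    def __init__(self, send_led, period=5):
        # send_led is a stand-in for whatever actually emits the
        # MIDI message that lights or darkens the pad.
        self._send_led = send_led
        self._period = period
        self._ticks = 0
        self._lit = False

    def tick(self):
        # Call this once per timer event; flips the LED each `period` ticks.
        self._ticks += 1
        if self._ticks >= self._period:
            self._ticks = 0
            self._lit = not self._lit
            self._send_led(self._lit)
```

One of these per clip pad, fed from a single script-wide timer callback, would give the blinking without any help from the hardware.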

Finally, I have basic identification of the project name and authorship displaying on the two screens upon initialization of the script. Discovering how to use the screens in Controller mode was an amazing epiphany for me. I am pretty sure I was the first to do this, but it depends on how you look at it. I am simply sending messages via sysex to a certain knob, and they show up on the screen. I created simple constants with 14, 15, and 30 spaces in order to crudely format the LCDs. I want to eventually integrate this behavior into the functionality of the various components, so that I can simply trigger events to display messages: both a persistent display explaining the states and names of the parameters being controlled, and toast messages (quick interruptions, like the name of a mode when you press that button). I would like the persistent display to be uncluttered and only show numbers or abbreviations of the param names; to get both the number and the full name of a parameter, you'd hold some shift button and turn the knob. While doing this sort of identification, the knobs (continuous rotaries) would not transmit their normal messages.
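The crude fixed-width formatting can be done with plain string padding. This sketch assumes a made-up 14-character segment width and placeholder header bytes; the real Maschine sysex header isn't shown here and would have to be captured with a MIDI monitor:

```python
SEGMENT_WIDTH = 14  # assumed width of one screen segment, in characters

def pad_segment(text, width=SEGMENT_WIDTH):
    """Clip or pad a label so it exactly fills one LCD segment."""
    return text[:width].ljust(width)

def build_display_sysex(header, text):
    """Wrap padded ASCII text in a sysex message.

    `header` is whatever byte sequence the device expects after 0xF0;
    the values here are placeholders, not the real Maschine header.
    """
    body = [ord(c) & 0x7F for c in pad_segment(text)]
    return [0xF0] + list(header) + body + [0xF7]
```

Fixing the segment width up front means every message overwrites the previous label completely, so stale characters never linger on the screen.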

Tuesday, February 22, 2011

Reaktor Tips: Introducing the Reaktor Sampler Pack

Amazing right out of the box, er, encrypted rar. These granular ensembles are super rad! I'd like to help get the word out about them. It looks like he uses Maschine as well as Reaktor. This is an ultimate setup. The patches that Peter shipped the Mirage with are gorgeous on their own. I can't wait to melt into these clouds of sonic condensation. This precipitates some amazing new work, I think. Go buy these plugins!

Reaktor Tips: Introducing the Reaktor Sampler Pack: "I'm pleased to release three new granular sampling instruments for Reaktor: Frame 2, Loupe and Mirage. Each one has its own character and p..."

Saturday, February 19, 2011

Groove-Boxing, Black-Ops, and Deranging Sound

I am finally getting back to developing the VoidMaschine script. I swear, I never wanted to stop. I am so into spending every second of my life programming, that I immediately qualify for a job at your company.. um yeah so anyway.. I moved in January and work has been a bit hectic, blah blah blah. I want to have the grid modes finished before offering the code to the Native Instruments and Ableton forums, so it is usable before it is even better than ever, which is what I am shooting for. Yeah, a MIDI Remote Script better than ever. That is the goal. That is what I am shooting for. That is the goal that I am shooting for and at. I hope that this project makes lots of groove-boxers happy.

oh yeah, the git repository is git://github.com/misterinterrupt/voidmaschine.git, nothing is guaranteed to work or even make sense, so let me know if you want to collaborate on this or want to use it and have any issues. Contact me in the comments here or wherever you find me on the stalknet.

Saturday, December 18, 2010

Simple Insight

Some of the Remote Scripts have a function called update_display.
I was scouring the various online resources for info on this, because I wanted to utilize it for making a blinking button. This is part of the 'special behavior' that any MIDI Remote Script must include: the script object must implement a sort of abstract class.

In reading the sparse comments of some of the _Framework classes, I got the big idea to run this __init__.py script. I thought it would show any expected methods in the Live module that one can import:
import Live

def create_instance(c_instance):
    # log every name the Live module exposes,
    # then hand Live a None in place of a script object
    for name in dir(Live):
        c_instance.log_message(name)
    return None
Then I checked my log file and found output that reflects the LOM, and also a nice little class signature spelled out in errors:
8713361 ms. RemoteScriptMessage: Application
8713362 ms. RemoteScriptMessage: Clip
8713362 ms. RemoteScriptMessage: ClipSlot
8713362 ms. RemoteScriptMessage: Device
8713363 ms. RemoteScriptMessage: DeviceParameter
8713363 ms. RemoteScriptMessage: MidiMap
8713363 ms. RemoteScriptMessage: MixerDevice
8713364 ms. RemoteScriptMessage: Scene
8713364 ms. RemoteScriptMessage: Song
8713364 ms. RemoteScriptMessage: String
8713365 ms. RemoteScriptMessage: Track
8713365 ms. RemoteScriptMessage: __doc__
8713365 ms. RemoteScriptMessage: __name__
8713366 ms. RemoteScriptError: AttributeError
8713366 ms. RemoteScriptError: :
8713366 ms. RemoteScriptError: 'NoneType' object has no attribute 'refresh_state'
8713367 ms. RemoteScriptError:

8713367 ms. RemoteScriptError: AttributeError
8713367 ms. RemoteScriptError: :
8713368 ms. RemoteScriptError: 'NoneType' object has no attribute 'connect_script_instances'
8713368 ms. RemoteScriptError:

8713368 ms. RemoteScriptError: AttributeError
8713369 ms. RemoteScriptError: :
8713369 ms. RemoteScriptError: 'NoneType' object has no attribute 'can_lock_to_devices'
8713369 ms. RemoteScriptError:

8713525 ms. RemoteScriptError: AttributeError
8713525 ms. RemoteScriptError: :
8713526 ms. RemoteScriptError: 'NoneType' object has no attribute 'build_midi_map'
8713526 ms. RemoteScriptError:

8713552 ms. RemoteScriptError: AttributeError
8713552 ms. RemoteScriptError: :
8713553 ms. RemoteScriptError: 'NoneType' object has no attribute 'suggest_input_port'
8713553 ms. RemoteScriptError:

8713554 ms. RemoteScriptError: AttributeError
8713554 ms. RemoteScriptError: :
8713555 ms. RemoteScriptError: 'NoneType' object has no attribute 'suggest_output_port'
8713555 ms. RemoteScriptError:

8713681 ms. RemoteScriptError: AttributeError
8713681 ms. RemoteScriptError: :
8713682 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8713683 ms. RemoteScriptError:

8713791 ms. RemoteScriptError: AttributeError
8713791 ms. RemoteScriptError: :
8713792 ms. RemoteScriptError: 'NoneType' object has no attribute 'update_display'
8713793 ms. RemoteScriptError:

... update_display continues to be called at a regularish rate
8799427 ms. RemoteScriptError: AttributeError
8799427 ms. RemoteScriptError: :
8799428 ms. RemoteScriptError: 'NoneType' object has no attribute 'disconnect'
8799428 ms. RemoteScriptError:
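Read together, those AttributeErrors spell out the 'abstract class' a script object is expected to implement. Here's a do-nothing skeleton inferred purely from the attribute names in the log above (the method signatures are my guesses, not from any official documentation):

```python
class NullScript:
    """Stub implementing the methods Live appears to call on a script,
    inferred from the AttributeError log (signatures are assumptions)."""

    def __init__(self, c_instance):
        self._c_instance = c_instance

    def refresh_state(self):
        pass

    def connect_script_instances(self, instantiated_scripts):
        pass

    def can_lock_to_devices(self):
        return False

    def build_midi_map(self, midi_map_handle):
        pass

    def suggest_input_port(self):
        return ''

    def suggest_output_port(self):
        return ''

    def update_display(self):
        # Called at a roughly regular rate -- a handy timer hook.
        pass

    def disconnect(self):
        pass

def create_instance(c_instance):
    # Returning a real object instead of None silences the AttributeErrors.
    return NullScript(c_instance)
```

Note how often update_display shows up in the log: Live keeps calling it at a regular-ish rate, which is exactly the hook a blinking-button timer needs.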


Friday, December 3, 2010

Using MCU protocol to communicate

I think Ableton Live does what it was made to do really well. Native Instruments' Maschine does too, and is now taking the innovation of live performance with pattern-based sequencing to another level by tackling hardware integration. In particular, they are all about keeping the eyes off the screen when it can be avoided (amazing direction, use the processor and shut the fucking laptop!). Now to be fair, Ableton paired with an MCU-mapped control surface seems to pretty much do what Maschine is trying to do, at least conceptually, since the MCU protocol is used to display dynamic control labels on controller screens (like Novation's Automap stuff). I just found out about MCU and was psyched to see this Maschine MIDI template in the NI Maschine forums. The template uses Maschine's MCU mapping capabilities and makes short work of Ableton's interface. Neon Fucking Golden.

So, when I became aware of the _Framework scripts, at first I was thinking that I wanted to contribute to Hans's awesome work on the APC40_22 script that he created in a fit of awesomeness. I basically wanted to add a 'mode' on my own and offer it to his project. But the more I saw of what he had done, the more things I saw that I wanted to implement, and I thought I would need to learn _Framework better before shooting for the stars in my eyes. So I kept checking out how exactly Ableton's _Framework actually makes the magic happen, and as I messed with my APC40_22 and my Maschine, I was really wishing I would somehow be able to use Maschine's screens.

I saw the possibilities of a DJ-like live performance workflow that would only require me to use the Maschine in its various modes (and using the instance switcher). A very basic example would be to have the Maschine VST on two Live tracks and be able to control Ableton, some FX, and maybe some clips via the Maschine's MIDI controller mode. Then I would load a track up in each Maschine instance and bring the Live track down on one. I could start and mix a track in the other one, then switch to control mode, slowly mix in the other Maschine track, create a cool transition with FX in Live, then mix the FX and the first Maschine track out, and continue ad infinitum.

So, I am working on my own version of the feature-lean (but not unappreciated) Maschine MIDI template for Ableton, which works with an NI-provided MIDI Remote Script. I was trying out different things in the NI Controller Editor (changing the control mappings) and flipping through the remote scripts in the Ableton preferences, when suddenly I saw text pop up on the Maschine screens. It showed up in the middle as two messages: one said 'Ableton Live' and the other 'Device Offline'. It was the Mackie script doing this, and after being very confused and trying all sorts of stuff, I finally found out that you need only one of the controls around the Maschine screens set to be an MCU-style control in order to get these messages to show up. Then I downloaded MIDI Monitor and Sysex Librarian from snoize to find out what was happening between the script and the controller. This let me capture and learn the basic MCU sysex headers and the messages needed for writing to the Maschine screens. Holy wow, I was so excited, my girlfriend took a trip to Montana to get away from me blabbing this blog post to her over and over.
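For anyone wanting to poke at this themselves, an MCU-style LCD write can be sketched like so. The header and command bytes below are the ones commonly reported in publicly circulated Mackie Control notes and in MIDI monitor captures, not from an official spec, so verify against your own captures before trusting them:

```python
MCU_HEADER = [0x00, 0x00, 0x66, 0x14]  # commonly reported Mackie Control header
LCD_CMD = 0x12                          # LCD write command, per public MCU notes

def mcu_lcd_message(offset, text):
    """Build a Mackie-Control-style LCD write sysex message.

    offset: character position on the display (0 = top-left).
    Byte values are assumptions from reverse-engineered MCU notes.
    """
    data = [ord(c) & 0x7F for c in text]  # sysex payload must stay 7-bit
    return [0xF0] + MCU_HEADER + [LCD_CMD, offset & 0x7F] + data + [0xF7]
```

Sending a message like this through the controller's MIDI output is essentially what the Mackie script was doing when 'Ableton Live' and 'Device Offline' popped up on the screens.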

Grooveboxes, sequencing, and Roland's polyphony impotency

If you have used 'groove boxes', either one of the diamonds in the rough or their annoying ilk, then you may understand just how unique the Yamaha groove-style sequencers are (including the QY series, though I never owned one). Amongst the throngs of seething 16-step rhythm and bass machine style sequencer lovers, I dream of an evolution of pattern based sequencing with scenes for morphing sets of controllers. I used to use these weekly to create experimental ambient and Reflex-like braindance music. Eventually, I sold these things because I had sort of used them up and I like change and flux and don't want my tools to define my music.

Ableton Live is actually very much like these Yamaha sequencers. Similar to these pattern/section sequencers, Live has a linear 'song' mode and the ability to trigger sets of phrases which play sounds on tracks (called scenes in Live, patterns on the grooveboxes). Ableton also adds the only useful abilities of the Roland phrase samplers (SP-404, SP-808) and MPC samplers: triggering samples and sequences live or out of sequence. Luckily, it inherited neither the horrible effects nor the ridiculously low polyphony of those Roland toys.

I have observed that the feature sets of electronic music devices are basically locked specifications. They are set per device or per application and then never really change in major ways. Too bad. Right, right, I know.. just use Max/MSP.. yeah, totally. I love what is possible with Pd, Max, AudioMulch, Bidule, and other visual programming paradigms. Though I may just be a sucker, or lazy, or unwilling to reinvent the wheel, I immediately gravitated towards Live when it was a baby little version 1.0. I think of Ableton versus Max/MSP the way I think of OS 9 (and previous Apple OSes) versus Unix. In that analogy, Unix and Max are both very powerful but slightly painful for those without enough experience to create and maintain custom tools, while OS 9 was better looking, smoother, and required little effort to learn. Back when OS 9 was around, I was baffled by Unix, idolized hackers, and couldn't afford Apple computers anyway. Now OS X is tightly integrated with Unix and it runs Max for Live. More than a win-win, that's a win cubed. Blah blah blah..

Sunday, November 28, 2010

Dreaming like a true control freak

That is what I have been doing: dreaming. Oh, and some drooling, and a lot of searching and digging through the net. I guess I will start a links list like Hans over at the _Framework blog.

As an intro to this upload: There are some features that I have always wanted for the gear that I use to create my dreams in music. I am also interested in the entire composition workflow and style of live electronic music performance where these features would be helpful.

I want to detail some of my history with and interest in different gear from the perspective of how that gear was or wasn't useful for informing an innovative live electronic music performance. Part of what I mean by 'live electronic music performance' is the ability to improvisationally arrange created, found, or improvised composition elements expressively, as well as to orchestrate sound design nuance and musicianship with pre-recorded and on-the-fly created phrases and samples.

Since I began creating music with the Yamaha RM1x in the mid-nineties, and on through the early aughts using the Kurzweil K2600 and the Yamaha RS7000, I dreamt of modifying the OS on these pieces of hardware. In 2005, I left Massachusetts College of Art, which marked a conscious divergence from my concentration in live performance to follow a path towards gaining the experience I would need to one day hack, customize, create, or otherwise manipulate my reality to get what I was looking for in music performance tools. Now I am a full-time web programmer and I am starting to put my knowledge to work for music tools. I continue to create music in the studio, have done commercial audio work, and have worked hard on my sound design and mixing capabilities.

Wednesday, October 20, 2010

Embedded Python in Ableton Live

It's late and I just decided to start posting here for all the right reasons. I haven't been able to stop hacking around with the Live Python API. I have a project that I am working on which will be a great extension for the MIDI Remote Script for any controller that has a button grid (or any buttons actually..) So far, I have read all of the articles about the framework classes on remotescripts.blogspot.com and pulled a bunch of code together to make a simple logger that will allow me to easily dump objects or do the simple trace, etc. So rad, I love python. Yeah, this is a teaser post, I hope to write more and post code soon too.