Big news: Emily and I are engaged. 🙂 Two weeks ago while we were upstate apple-picking with some friends, I proposed and Emily said yes. And before you ask, no, we haven’t set a date or a place yet. 🙂

Image: Emily and I getting engaged. Flickr set: apple pickin’

This trumps all other news but it doesn’t get me out of doing a proper post. What else? I’ve been very (good) busy with work for the past couple months, beginning with a major project for us: the audio, video and lighting design for an art installation by William Pope.L at Hauser & Wirth gallery. We spent large chunks of August and September collaborating with Pope.L on the design and the producer Adi Nachman on making it all happen (both of whom were great to work with), and installing everything at the gallery. The installation closed on Saturday, October 24th, after a successful five week run, and I’m told that attendance was usually about 100 visitors/day and much higher on peak days.

relations between hardware and content/intent

For me, the most difficult stage of preparing a visual performance lately has been building the ‘patch’ (these days a project file in VDMX) and deciding on the connections between my hardware controllers (Trigger Finger, Xbox 360 controller, Wiimote), the patch, and the content (video files, animations, generative elements). Often I wish I could make a single click/drag connection between a particular controller and the effect or generator I want to control: connect Wiimote to OpenEmu instance, done, rather than wiring 7 of 15 Wiimote data sources to 6 or 12 effect parameters. Thinking about this tonight, I was reminded of Lance Blisters, the audio/visual duo of Geoff Matters (music) and Ilan Katin (visuals), and what I learned of their working process when I subbed for Ilan for several shows. Some aspects of that process:

  • Lance Blisters (Geoff and Ilan) chose to use the Trigger Finger exclusively to control the visuals. Unless something went wrong, Ilan would not need to touch the mouse or keyboard during the show.
  • Geoff controlled the music with a MIDI guitar. After each song, he sent a MIDI command from the guitar (guitar > his laptop > MIDI over WiFi) to Modul8 on Ilan’s computer, triggering the loading of the next song’s project file via custom modules they wrote. Again, no mousing: just one MIDI CC command triggering the next song’s visual setup. (And one song’s visuals (Grindcore?) were controlled entirely by Geoff via the MIDI guitar.)
  • The controls for each song’s visuals were fitted to the capacity of the Trigger Finger, not the other way around. (And they chose the Trigger Finger because it has 16 pads, matching the 16 slots in Modul8’s media bin.) If a song used more than 16 media files, a row of pads often served as a bank switch. Chorus: tap 12 pads to switch animations in time with the music. Bridge: tap one pad to switch to another bank of media (reflected onscreen in Modul8), then tap the same 12 pads to bring up different media. Again, custom modules were written to make Modul8 fit the songs, not the other way around, e.g., to change the MIDI mappings on the fly, sometimes with a module per song.
  • Each song was practice-practice-practiced to get it into muscle memory.
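The bank-switching scheme above is easy to sketch in code. The snippet below is a hypothetical reconstruction, not their actual custom Modul8 modules (which I never saw the insides of): it assumes a 4×4 pad grid where the last row selects one of four banks and the remaining 12 pads trigger media slots within the current bank.

```python
# Hypothetical sketch of the pad/bank scheme described above (NOT the actual
# Lance Blisters Modul8 modules): a 4x4 pad grid where pads 12-15 select a
# bank and pads 0-11 trigger media slots within the currently selected bank.

class PadBanks:
    TRIGGER_PADS = 12            # pads 0-11 fire media
    BANK_PADS = range(12, 16)    # assumed layout: last row switches banks

    def __init__(self):
        self.bank = 0

    def hit(self, pad):
        """Handle a pad hit: return the media slot to show, or None on a bank switch."""
        if pad in self.BANK_PADS:
            self.bank = pad - self.TRIGGER_PADS  # select bank 0-3; nothing fires
            return None
        return self.bank * self.TRIGGER_PADS + pad  # slot within current bank
```

So hitting pad 3 fires slot 3; hitting pad 13 silently switches to bank 1, after which the same pad 3 fires slot 15. Four banks give 48 addressable media from 16 pads, which covers the "more than 16 media" case above.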

What is there to take away from this? Obviously the last point is the strongest: fit the software to the hardware; fit the patch to the song. Whatever you decide to do, practice the heck out of it until it’s second nature, making the set tighter and freeing you to play, even improvise. What else? I’m thinking of rewriting all my qcFX (most of which are wrappers for v002 plugins — thank you, vade :)) so that they fit my controllers, instead of experimenting with different controller mappings during shows. Maybe I’ll get more use out of the Trigger Finger’s pads by creating different ‘stab’ behaviors in the different FX/generators, e.g., using the 16 pads as a spatial grid, turning on and switching the direction of particle systems that stream from/in the four quadrants of the screen. We’ll see.
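To make that spatial-grid idea concrete, here’s a hypothetical sketch (the function name and conventions are mine, not anything in VDMX or my qcFX): each pad on the 4×4 grid maps to a screen quadrant, and the pad’s offset from the grid centre gives a stream direction for the particle system in that quadrant.

```python
# Hypothetical sketch of the 'pads as spatial grid' idea: map each pad on a
# 4x4 grid (row-major, top-left origin) to a screen quadrant, plus a direction
# vector pointing from the grid centre out through the pad's position.

def pad_to_quadrant(pad):
    """Return (quadrant name, (dx, dy)) for pad index 0..15; y points up."""
    row, col = divmod(pad, 4)
    vert = "top" if row < 2 else "bottom"
    horiz = "left" if col < 2 else "right"
    direction = (col - 1.5, 1.5 - row)  # offset of the pad from the grid centre
    return f"{vert}-{horiz}", direction
```

Pad 0 (top-left corner) would stream up and to the left, pad 15 down and to the right, and the inner four pads give gentler diagonals, so one grid of pads carries both on/off and direction per quadrant.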

More on the One Step Beyond show soon, hopefully. In case it doesn’t happen, some thank yous: to vade for his plugins, to Momo for his “Momo particles” QTZ and his four-layer setup, to Benton, Owen, Jasmine, Reid, Emily, SeeJ, Peter, and the museum crew, to Chris Covell for his NES demos, to No Carrier for glitchNES, and to Vidvox for VDMX. 🙂


I voted. And I volunteered as a videographer for Video The Vote, going to polling stations in Brooklyn where voters had reported problems or obstruction. Pretty minor stuff: at my polling station and another nearby, they were requiring ID, which is not required in New York state; at another, a poll worker refused to take off her Republican campaign button. When I showed up and walked through, there were no buttons to be seen, so I drove all the way over there for nuttin. Drat.

Also, I posted about the public beta release of the v002 Rutt/Etra software synthesizer on CreateDigitalMotion.com. I’ve been an alpha tester and helper on the project, even putting my meager Illustrator skills to use recreating the RE logo. It’s a beautiful tool — check it out. 🙂

And I picked up my girlfriend from the airport, who was down in North Carolina getting out the vote. Now she’s glued to television, radio, Twitter and NYTimes.com while I make dinner. Ah, domestic bliss. Later we might go by 3rd Ward’s election party to agonize over the returns among like-minded Brooklynites.

My brother’s probably already seen this on BoingBoing, but I wanted to post this video here for my parents. Since they got themselves a Wii for Christmas, I hope they’ll appreciate the technical side of this hack as well as its artistry and nostalgia-hacking glee.

Found on BoingBoing

Last couple thoughts for the night:

  • I’ve finally got time to play around with a media-textured avatar (think video skin)
  • I want to make an avatar (in SL) that moves like Duchamp’s Nu descendant un escalier n° 2 (Nude Descending a Staircase, No. 2)
  • I learned a lot about texturing from the SL fora today. Check the end of the GIMPshop texturing tutorial I wrote last week for some illuminating reading on the subject.
  • I made a couple of ringtones in GarageBand last week (one of the tiny projects I squeezed in to entertain myself after long hours of teaching), and easybeat was very handy in the mastering stage*. Recommended.

* that is, playing it on my phone and transposing it in nudges until it sounded good.