Thoughts on structuring a visual performance

relations between hardware and content/intent

The most difficult stage of preparing a visual performance lately has been building the ‘patch’ (currently a project file in VDMX) and deciding on the connections between my hardware controllers (Trigger Finger, Xbox 360 controller, Wiimote), the patch, and the content (video files, animations, generative elements). Often I wish I could make a single click-or-drag connection between a particular controller and the effect or generator I want to control: connect Wiimote to OpenEmu instance, done, rather than wiring 7 of 15 Wiimote data sources to 6 or 12 effect parameters. Thinking about this tonight reminded me of Lance Blisters, the audio/visual duo of Geoff Matters (music) and Ilan Katin (visuals), and what I learned of their working process when I subbed for Ilan for several shows. Some aspects of that process:

  • They chose to use the Trigger Finger exclusively to control the visuals. Unless something went wrong, Ilan never needed to touch the mouse or keyboard during the show.
  • Geoff controlled the music with a MIDI guitar. After each song, he sent a MIDI command from the guitar (guitar to his laptop, then over WiFi) to Modul8 on Ilan’s computer, triggering the loading of the next song’s project file via custom modules they had written. Again, no mousing: a single MIDI CC command triggered the next song’s visual setup (see the first sketch after this list). (And one song’s visuals (Grindcore?) were controlled entirely by Geoff via the MIDI guitar.)
  • The controls for each song’s visuals were fitted to the capacity of the Trigger Finger, not the other way around. (They chose the Trigger Finger in the first place because its 16 pads match the 16 slots in Modul8’s media bin.) If a song used more than 16 media files, a row of pads often served as a bank switch: in the chorus, tap 12 pads to switch animations in time with the music; at the bridge, tap one pad to switch to another bank of media (reflected onscreen in Modul8), then tap the same 12 pads to bring up different media (see the second sketch after this list). Again, custom modules were written to make Modul8 fit the songs, not the other way around, e.g., changing the MIDI mappings on the fly, sometimes with a dedicated module per song.
  • Each song was practice-practice-practiced to get it into muscle memory.
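
For the curious, here is a minimal sketch of the “one CC advances the show” idea, in Python (the language Modul8’s modules are scripted in). Everything specific here is an assumption: the mido library stands in for whatever MIDI layer you use, and the port name, CC number, project filenames, and load_project() helper are all hypothetical placeholders for what your VJ app actually exposes.

```python
import mido  # assumed MIDI library; macOS network MIDI shows up as a normal port

SONG_PROJECTS = ["song01.vdmx5", "song02.vdmx5", "song03.vdmx5"]  # hypothetical files
NEXT_SONG_CC = 20  # assumed CC number; use whatever the guitar actually sends

def load_project(path):
    """Placeholder: hand the project file to the VJ app (app-specific)."""
    print(f"loading {path}")

current = 0
with mido.open_input("Network Session 1") as port:  # assumed port name
    for msg in port:
        # One CC from the guitarist's rig advances to the next song's setup.
        if msg.type == "control_change" and msg.control == NEXT_SONG_CC and msg.value > 0:
            current = (current + 1) % len(SONG_PROJECTS)
            load_project(SONG_PROJECTS[current])
```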
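
And a sketch of the bank-switch trick: one row of pads selects the bank, the other twelve fire media, so the same twelve pads can reach 48 slots. The pad note numbers and trigger_media() are hypothetical; the real mapping would live in a custom module or the host app.

```python
PAD_NOTES = list(range(36, 52))   # assumed: the 16 pads send notes 36..51
BANK_PADS = PAD_NOTES[12:]        # top row reserved as four bank selectors
MEDIA_PADS = PAD_NOTES[:12]       # remaining 12 pads fire media

bank = 0

def trigger_media(slot):
    print(f"trigger media slot {slot}")  # placeholder for the real app call

def on_note(note):
    global bank
    if note in BANK_PADS:
        bank = BANK_PADS.index(note)  # one tap switches banks
    elif note in MEDIA_PADS:
        # Same physical pad, different media depending on the current bank.
        trigger_media(bank * len(MEDIA_PADS) + MEDIA_PADS.index(note))
```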

What is there to take away from this? Obviously the last point is the strongest: fit the software to the hardware; fit the patch to the song. Whatever you decide to do, practice the heck out of it until it is second nature; that tightens the set and frees you to play, even improvise. What else? I’m thinking of rewriting all my qcFX (most of which are wrappers for v002 plugins; thank you, vade :)) so that they fit my controllers, instead of experimenting with different controller mappings during shows. Maybe I can get more use out of the Trigger Finger’s pads by building different ‘stab’ behaviors into the different FX/generators, e.g., using the 16 pads as a spatial grid that turns on and redirects particle systems streaming from or in the four quadrants of the screen (a sketch of that mapping follows). We’ll see.
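
To make the spatial-grid idea concrete, here is a hedged sketch of how the 4×4 pads could map to quadrants and stream directions. The row-major pad ordering and the direction names are assumptions; the actual particle control would live inside the qcFX.

```python
DIRECTIONS = ["up", "right", "down", "left"]

def pad_to_quadrant_and_direction(pad):
    """pad: 0..15, row-major from the bottom-left of the 4x4 grid."""
    row, col = divmod(pad, 4)
    quadrant = (row // 2) * 2 + (col // 2)  # 0..3: which quarter of the screen
    local = (row % 2) * 2 + (col % 2)       # position of the pad within its quadrant
    return quadrant, DIRECTIONS[local]      # quadrant toggles, direction steers

# e.g. pad 0 -> quadrant 0 streaming "up"; pad 15 -> quadrant 3 streaming "left"
print(pad_to_quadrant_and_direction(0), pad_to_quadrant_and_direction(15))
```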

More on the One Step Beyond show soon, hopefully. In case it doesn’t happen, some thank yous: to vade for his plugins, to Momo for his “Momo particles” QTZ and his four-layer setup, to Benton, Owen, Jasmine, Reid, Emily, SeeJ, Peter, and the museum crew, to Chris Covell for his NES demos, to No Carrier for glitchNES, and to Vidvox for VDMX. 🙂
