danwinckler.com/visuals


Recently my friend and collaborator Josephine Dorado invited me to speak with her Social Media Mashup class at The New School, which I was delighted to do. The conversation (on Seesmic) ranged across a variety of topics, including improvisation vs. structure (in my work and live visuals in general), visual performance tools, and how my own background as a theater/comedy actor has impacted my live visual work. Now that the class is moving on to another topic, I’m posting this here for anyone who’d like to keep the conversation going, e.g., ask me all the heavy, technical questions I said I’d answer later. 😉 Thanks again, Josephine, Barb, Tom, Debbie, andrihatesjazz, Cecelia, Antoine, Rick and all you lurkers. Happy holidays!

Last month Glomag and I kicked off a new collaboration with a show at XRAY NYC, a new monthly burlesque/magic/awesomeness show put on by those who organized the late Freaks ‘n’ Geeks party. Some of the video Emily (my steadfast documentarian) shot will soon make its way online, but in the meantime, I hope you like these shots Asif Siddiky of 2 Player Productions took as much as I do. Thanks, Asif!

Glomag

Glomag and me

Je Deviens DJ En 3 Jours

relations between hardware and content/intent

The most difficult stage of preparing for a visual performance lately has been building the ‘patch’ (these days, a project file in VDMX) and deciding on the connections between my hardware controllers (Trigger Finger, Xbox 360 controller, Wiimote), the patch, and the content (video files, animations, generative elements). Often I wish I could make a single click-or-drag connection between a particular controller and the effect or generator I want to control: connect Wiimote to OpenEmu instance, done, rather than wiring 7 of 15 Wiimote data sources to 6 or 12 effect parameters. Thinking about this tonight, I was reminded of Lance Blisters, the audio/visual duo of Geoff Matters (music) and Ilan Katin (visuals), and what I learned of their working process when I subbed for Ilan for several shows. Some aspects of that process:

  • Lance Blisters (Geoff and Ilan) chose to use the Trigger Finger exclusively to control the visuals. Unless something went wrong, Ilan would not need to touch the mouse or keyboard during the show.
  • Geoff controlled the music with a MIDI guitar. After each song, he sent a MIDI command from the guitar (guitar > his laptop > WiFi) to Modul8 on Ilan’s computer, triggering the loading of the next song’s project file via custom modules they wrote. Again, no mousing: just one MIDI CC message triggering the next song’s visual setup (a minimal sketch of that trigger follows this list). (And one song’s visuals (Grindcore?) were controlled entirely by Geoff via the MIDI guitar.)
  • The controls for each song’s visuals were fitted to the capacity of the Trigger Finger, not the other way around. (They chose the Trigger Finger because it has 16 pads, matching the 16 slots in Modul8’s media bin.) If a song had more than 16 media files, a row of pads was often used as a bank switch. Chorus: tap 12 pads to switch animations in time with the music. Bridge: tap one pad to switch to another bank of media (reflected onscreen in Modul8), then tap the same 12 pads to bring up different media; a sketch of this banking trick also follows the list. Again, custom modules were written to make Modul8 fit the songs, not the other way around, e.g., changing the MIDI mappings on the fly, sometimes with a module for each song.
  • Each song was practice-practice-practiced to get it into muscle memory.
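
For the curious, here’s roughly how little data that song-change handoff needs. A minimal sketch of the Geoff-side trigger in Python, assuming the mido library and a network MIDI session named ‘Session 1’; the port name and CC number are my guesses, and their real setup ran through custom Modul8 modules I never saw the insides of:

```python
# Hypothetical 'next song' trigger: a single control change message
# sent over a network MIDI session. Port name and CC number are
# assumptions, not Lance Blisters' actual values.
import mido

NEXT_SONG_CC = 20  # CC number the visuals machine listens for (assumed)

out = mido.open_output('Session 1')  # network MIDI session (assumed name)
out.send(mido.Message('control_change', channel=0,
                      control=NEXT_SONG_CC, value=127))
```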

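The bank switch is just as small an idea. Here’s a self-contained Python sketch of it, not their actual Modul8 module: the bottom row of pads picks a bank, and the other twelve pads index into whichever bank is current. All the media names are placeholders.

```python
# Hypothetical Trigger Finger banking: pads 0-11 select media within
# the current bank, pads 12-15 switch banks.
class PadBanks:
    def __init__(self, banks):
        self.banks = banks    # list of banks, each up to 12 media names
        self.current = 0

    def hit(self, pad):
        if pad >= 12:         # bottom row: choose bank 0-3
            self.current = pad - 12
            return None
        media = self.banks[self.current]
        return media[pad] if pad < len(media) else None

pads = PadBanks([['verse%02d' % i for i in range(12)],
                 ['chorus%02d' % i for i in range(12)]])
pads.hit(13)            # bridge: tap one pad to switch to bank 1
print(pads.hit(3))      # same pad as before, different media: chorus03
```
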
What is there to take away from this? Obviously the last point is the strongest. Fit the software to the hardware; fit the patch to the song. Whatever you decide to do, practice the heck out of it until it’s second nature, making the set tighter and freeing you to play, even improvise. What else? I’m thinking of rewriting all my qcFX (most of which are wrappers for v002 plugins; thank you, vade :)) so that they fit my controllers, instead of experimenting with different controller mappings during shows. Maybe I’ll get more use out of the Trigger Finger’s pads by creating different ‘stab’ behaviors in the different FX/generators, e.g., using the 16 pads as a spatial grid, turning on and switching the direction of particle systems that stream from the four quadrants of the screen. We’ll see.
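
In case that’s too abstract, here’s the kind of mapping I have in mind, sketched in Python; the pad numbering, the quadrant convention, and the direction math are all assumptions about a patch I haven’t built yet:

```python
# Hypothetical mapping from the Trigger Finger's 4x4 pad grid to screen
# quadrants: each pad picks a quadrant and a direction pointing away
# from the center of the grid.
def pad_to_quadrant(pad):
    """Pad 0-15, numbered left to right, top to bottom."""
    row, col = divmod(pad, 4)
    return ('top' if row < 2 else 'bottom',
            'left' if col < 2 else 'right')

def pad_to_direction(pad):
    """Direction vector away from the grid's center (x right, y up)."""
    row, col = divmod(pad, 4)
    return (col - 1.5, 1.5 - row)

for pad in (0, 5, 10, 15):
    print(pad, pad_to_quadrant(pad), pad_to_direction(pad))
```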

More on the One Step Beyond show soon, hopefully. In case it doesn’t happen, some thank yous: to vade for his plugins, to Momo for his “Momo particles” QTZ and his four-layer setup, to Benton, Owen, Jasmine, Reid, Emily, SeeJ, Peter, and the museum crew, to Chris Covell for his NES demos, to No Carrier for glitchNES, and to Vidvox for VDMX. 🙂

Here’s a small excerpt of my visuals with Dam-Funk at One Step Beyond last night. Details tomorrow. Thanks, Benton!


Tonight I’ll be doing a set in the “VJ battle” at MediaLounge, a new festival at Grace Exhibition Space with a ton of cool-looking video installations. Come enjoy the open bar whilst I brain VJs with my pixel mace.

This Saturday, I’ll be jamming with CJ (seej.net) and Peter Shapiro at the Love Your Lane Ride after-party, a benefit for Time’s Up, the environmental, alternative-transportation (bikes) group, at an enormous, indoor skate park called the Autumn Bowl. I shall paint its mammoth walls with hearts and spades. And bikes (Excitebike FTW!).

p.s. I’ve put up one video from Saturday’s show — more to follow

Tonight:
MediaLounge
at Grace Exhibition Space
840 Broadway, Brooklyn
time: 6pm (festival starts), 11pm (VJ battle)
free

Saturday:
Love Your Lane Ride after-party
at the Autumn Bowl
73 West St, Greenpoint, Bklyn
time: 8:30pm
$9.99 door