
First things first: Open Emu v1.0.0b2 is now available, featuring a massively refactored architecture, a refined user interface, several new emulator cores (GBA, Genesis and SNES), and the first public release of the Quartz Composer plugins. Go get it at openemu.sf.net. :D Let us know how you like it, send us video, screenshots, and bug reports, and tag your stuff on other sites with “openemu” so we can find it.

Open Video Conference

Although I missed most of the conference, what I did experience at OVC on Saturday and Sunday was great: met many interesting people, ingested tasty ideas and saw some great demos. More on this soon. :)

Here’s a rough demo video of Open Emu I whipped up for the conference despite intense sleep deprivation.


video: Open Emu demo from Dan Winckler on Vimeo.

Jan 14

Update: June 22, 2009

Open Emu version 1.0.0b2 is now available, featuring a massively refactored architecture and several new emulator cores. Go get it!

image: the Open Emu logo

Open Emu, an application I helped develop, is now available for download at SourceForge.net. Here’s my part of the story.

image: a screenshot from my visuals

At the first Blip Festival in 2006, I generated some of the visuals with jit.atari2600, a plugin for Max/MSP/Jitter (my platform at the time) that encapsulated an open source Atari emulator. Jit.atari2600 was buggy, so I quit using it in performance, but the idea of encapsulating an emulator and ‘bending’ it in software — as my friends noTendo and No Carrier do with hardware — stuck with me. Early last year, I began looking for an open source Nintendo emulator and learning Objective-C/Cocoa in order to try making an emulator plugin myself. I found Open Nestopia, an open source, Cocoa-based port of Martin Freij’s fearsomely thorough and accurate Nestopia emulator, and started work on the plugin during my residency at the Experimental Television Center. I contacted Open Nestopia’s developer, Josh Weinberg, who generously, patiently, and kindly helped me get the app building and gave me a sense of his code and what to do with it. Then I got really, really busy with other things and shelved the project until August when, with the help of Josh and Anton, I got the plugin to build and run in QC.

Anton joined the project — which in the meantime Josh had transformed into Open Emu, a framework for multiple emulators (NES, Atari, Sega, Gameboy) — and development really took off. Now, five months later, our first beta release of Open Emu is live on SourceForge, and the Quartz Composer plugins are in private beta with a public release coming soon. I couldn’t have learned this much and brought the plugin this far this quickly without the overwhelmingly generous help of Josh and Anton especially, as well as all the other friends and developers who’ve patiently answered my noob questions these many months. Thank you Josh, Anton, Eric, Ben and everyone else.

Gamers! If you’d like to play your favorite old school games on Mac OS X, download the beta and give it a whirl. It’s still got some bugs so we’d very much appreciate your feedback.

Visualists and hackers! Stay tuned to the Open Emu site for our Quartz Composer plugins, coming soon.

Oh, and if you’d like to see me use Open Emu in a show, come out to 8static in Philly next month.

p.s. We’re having a private beta on the plugins right now. If you’d like to try them, let me know. Note: you must have the Leopard Developer Tools (and thus Quartz Composer) installed for these to be useful.

May 25

It’s Fleet Week, apparently. Partying sailors abound. I was to see two shows tonight but the first one was so enthralling that I missed the second. So it goes. Adam Kendall did visuals for Roger Eno and Plumbline at Tonic: lovely music paired with absolutely brilliant visuals. Adam’s approach is very painterly and moving on a gut level. Since I saw his work for the first time two years ago, his craft has gotten better and better. Misty, melting, mnemonic melanges of powerful, personal films — see? Words don’t do it justice. Watch his Case Studies, which are fairly close to what he did tonight.

It was really cool to see a great pianist like Roger Eno play. He had a delicate touch and phrasing, placing his lines well within Plumbline’s laptop work. He showed how you could improvise just outside the tonal structure of a (seemingly) fixed set of tracks, something that had stumped my imagination when I thought about how to play piano in a Share jam alongside similar laptop musicians. And he watched Adam’s visuals closely. Thumbs up.

Aside to Adam: are you putting out 320 x 240? I’d love to see your stuff in higher res. Good reason to start incorporating those GPU shaders… :)

The show I missed was my friend Eli’s, which I wrote about earlier. Ah, well — next time (which is just what Eli said). He’s going on a solo tour this summer, hitting LA, Vancouver, Buffalo, and other places I can’t recall. If you like the tracks on his myspace and you know someone with a venue in the lower 48, drop Eli a line — he’ll probably be interested.

Challenges

Adam and Anton’s approaches seem similar and complementary to me. I hastily scribbled an idea that came to me during the show: challenges. I’d like to give collaborative challenges to my fellow/favorite visualists. For example, a swap: Adam and Anton do a duo show with their current setups (god’s eye and vade, respectively). Both predominantly use a library of video clips that are personally meaningful and formally interesting, which they know and have practiced well. Now swap their libraries and let each decide which clip the other will use next. Connect them with an Ethernet cable and a very simple Max patch to streamline the process: the patch notifies them when a video’s been selected and previews it so they can prepare to slip it in. (A rough sketch of what that patch would do follows below.)
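For the curious, here’s roughly what I mean, sketched as a tiny Python script rather than a Max patch. To be clear, this is just an illustration of the idea, not the actual patch: the peer address, port, and message format are all invented. Each visualist runs a copy; when you pick the next clip for your partner, the clip name goes over the wire and pops up on their end so they can cue it.

```python
# Hypothetical sketch of the clip-swap "challenge" protocol described above.
# The peer address, port, and message format are made up for illustration.
import socket
import threading

PEER_HOST = "192.168.1.2"  # your partner's machine (invented address)
PORT = 9000                # arbitrary port both copies agree on

def listen():
    """Announce each clip selection the partner sends over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    while True:
        data, _ = sock.recvfrom(1024)
        print(f"\nYour next clip (partner's pick): {data.decode()}")

def send_selection(clip_name):
    """Tell the partner which clip from their own library to use next."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(clip_name.encode(), (PEER_HOST, PORT))

if __name__ == "__main__":
    threading.Thread(target=listen, daemon=True).start()
    while True:
        send_selection(input("Pick your partner's next clip: "))
```

In Max itself this would be little more than something like a udpsend/udpreceive pair wired to a preview window, which is why I think it could be thrown together quickly.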

Regardless of whether A and A would dig this idea, it’s the kind of collaborative ‘game’ (or structure or form) I’d like to explore more. Rather than focus on the technical aspects of current and future video mixers, which seems to snag us all up when we talk about visual jams, I’d like to see my fellow visualists play games with each other like this. And I’d like to build simple Max patches — and potentially KeyWorx plugins, in the new version of KeyWorx that’s on the table for the 2nd phase of Kids Connect — to aid these games. Thoughts?

Kids Connect dev

Speaking of Kids Connect, we had a really good meeting today that cleared up a lot of the questions Josephine and I had: the level of supervision needed, whether (and how many) student teachers we’d have to help us teach, when we’d get funds released to start work in Second Life, and more. Plus we were joined by Dr. Garey Ellis, who heads the Promise Fund’s Inner Force program. Not only did he have valuable insights and suggestions for KC, he also reminded us how new this kind of work (online collaboration, visual performance, creative uses of consumer technology) is, and how exciting it will be for the workshop students and parents. It feels really good to be sharing my knowledge outside of the relatively narrow improv comedy world.

Apr 26

O, to be asleep instead of writing at 2:11 AM. O, alack this brain of mine.

The long and short, quickly: got back from Texas on Sunday. Four of us Share people went down to set up a Share jam at the Media Archaeology Festival at Aurora Picture Show. Went great; saw and met many wonderfully sweet people. Pics to come on the Share site.

The big news: my thesis has changed. A great opportunity fell in my lap; things folded together. I sat in on a net conference call two weeks ago and whiff-boom-bang! I’m suddenly co-organizing and teaching Kids Connect, a series of summer workshops for kids in theatrical and technological collaboration, sponsored by ZoomLab, the Waag Society and Polytechnic University. It brings together a lot of my interests and goals — it gelled quickly, a total no-brainer. One thing I brought to the table is the still relatively untapped potential for education and performance in Second Life, and that’s what’s keeping me up brainstorming right now. What would make a compelling indigenous performance in Second Life? That is, one that is not virtual set dressing for a real-life performance but a truly virtual performance that couldn’t be done in meatspace.

Wandering through SL, I’m struck again and again by how meatoid it is. Virtual human bodies walking on two legs and seeing with one eye. Houses with four walls as if there were a need for load-bearing members. It seems to me that an indigenous, exciting Second Life performance ought to be code-intensive (possibly generative), interactive, and transformative — literally body-changing — warping your avatar, multiplying and distributing its eyes and ears. Kaleidoscopic eyes as big as houses. Think a live machinima of The Matrix: The Musical!, with Vishnu as Neo and dance numbers choreographed by the mutant child of Busby Berkeley and Chris Cunningham. Hopefully this silly hyperbole sounds more exciting to you than listening to streaming audio while watching a jerky animation of a guy playing a guitar.

Not that I’m no longer into Real Life/Second Life performances, of course.

Maybe I can get to sleep now.

p.s.

Inworld I’m Dan Magpie and here’s a link to my land.

Apr 7

I added record functionality* and drag ‘n’ drop movie loading, and made the camera start process a bit simpler. Not too much really, but it’s a better patch. :) Let me know if you use it — I’d love to see screenshots or movies of your work.

image: brush-0.1.2.png

Get brush 0.1.2

* from a patch called Simple Mix by Peter Nyboer in the Jitter examples folder. Thanks for the great patch, Peter!