With Dan Hett as a guest on GameFace, we're cross-blogging this informative post on live-coded visuals. He gives a wonderful insight into getting started as a VJ. You can see the original HERE.
Dan Hett: Live Visuals 101
I’ve been meaning to write this for absolutely ages – I constantly have people asking about what I’m doing, how it works, what software I use, etc etc. This happens both online and often while I’m playing, and while I love talking about this stuff I’m starting to get sick of myself explaining it all. Aside from that, I’ve been spurred into writing for two other reasons: firstly, I’m on sort-of hiatus right now, and so I have more time to write. Secondly, I’m going to be guesting on the GameFace podcast in a few days and promised I’d put some words and pictures on the internet as visuals don’t work very well over an audio-only podcast ^_^
This post will be covering what I do, how I do it, the software I use, and the hardware that drives everything. It’s probably worth noting that as always: there is no right or wrong way to do anything. I’m far from being a pro VJ, and I’m definitely not a legendary livecoder – but I quite like operating in the middle of both disciplines, it’s a fun place to be. I throw in a bit of everything. Master of none, and all that.
In fact this advice carries over to all creative acts: if it looks/sounds/feels good to you, you’re doing it right. That’s it. Don’t let any naysaying neckbeards tell you otherwise.
OK, enough rambling. Let’s have a look at the specifics. Onwards!
- I keep things simple. Like, super super simple. Particularly with regard to the live coded aspect, I hate the sense of getting lost too much and not being able to claw things back from the brink. Same with my video setup, I prefer to have things clear and easy to use, over monstrous amounts of buttons and switches that I never use. There’s definitely a certain amount of dick-waving to be had in complicated setups, but I don’t subscribe to that macho nonsense. K.I.S.S!
- Everything fits into a backpack. This one is important, mostly as I tend not to drive to gigs and have to carry everything. I've considered getting bigger/more controllers and adding lights and strobes and all sorts of other shit, but ultimately I value portability. There's something satisfying about being able to drop into a venue and do cool stuff without needing to carry your own bodyweight in gear, and this of course fits in with point number one: SIMPLIFY.
- I try to record as much as possible. What this means is that I try to capture photos or quick videos of things when they're turning out well. Don't forget that most of my stuff is completely improvised, so sometimes I just happen upon something that works. Grabbing a picture and reviewing it later really hones your sense of what works and what doesn't. Of course when I'm doing print stuff I capture literally everything, but that's another story. Screengrabs are really handy for grabbing snippets too – I constantly take random screenshots of stuff that's looking interesting.
- I try to be ready for anything. I always pack at least a long VGA and an HDMI cable plus adaptors (and backup adaptors – they're a bastard in terms of getting lost or nicked). I've almost come unstuck a few times when I couldn't connect things up, and there was one night when I literally couldn't play because I was missing a VGA adaptor. Never again! I am always prepared. I pack a few other things too, which I'll detail below. Obviously for large scale gigs I know about things in advance, but when you're dropping in on a basement chiptune show it's hard to know what to expect.
I tend to not freak out too much about preparing anything massively specific in advance, mostly as Algoraves or chiptune/electronica shows tend to be very unpredictable at the best of times (and this is why I love ’em!). The exception to this would be something like when I played at the Analogue Trash festival – this was much more structured as I was playing planned sets for specific bands, and most of them had particular requirements about their content and general vibe, plus practical things like logo designs and timings and that sort of thing. This is totally cool of course, and I still used a lot of live code during the sets, but the broad creative approach (and pulling together all the assets) was decided well in advance.
Generally though, I improvise, with just a broad theme or two in mind. Chiptune stuff speaks for itself really: hard lines, squares, stark colours, mega contrast, tasty glitches, and animated GIFs up the wazoo. At something more varied like an Algorave I tend to play around a bit beforehand and make written notes on some of the ideas in advance, but don't actually do anything until I'm literally playing (which is sort of the point). For example, I'll leave myself a note that says something literally as simple as "SNARE STROBE CUBE", which is enough to jog me into doing a cube setup that has random internal squares that strobe and glitch out on snare hits. It's something I've tried in advance, and maybe practiced a few times, but the actual implementation happens live, just based off that little trigger. Without this gentle mental prod I find it really hard to remember what I planned – it's surprising how effective a few keywords are at jogging your memory. Of course, this also means that often what comes out is a bit different every time: good! I never live code anything I've memorised completely, that sort of defeats the point, right?
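For the curious, the "strobe on snare hits" trick boils down to very simple energy-based onset detection: flag a frame when the audio energy jumps well above its recent average. Here's a rough Python sketch of the idea (the function names are made up for illustration – this isn't Cyril's or VDMX's API):

```python
# Minimal energy-based "hit" detector, sketching the idea behind an
# audio-reactive strobe. Audio arrives as chunks of samples in -1..1.

def rms(chunk):
    """Root-mean-square energy of one chunk of samples."""
    return (sum(s * s for s in chunk) / len(chunk)) ** 0.5

def detect_hits(chunks, threshold=1.5):
    """Flag a chunk as a hit when its energy jumps well above a running average."""
    hits = []
    avg = None
    for chunk in chunks:
        energy = rms(chunk)
        hits.append(avg is not None and energy > avg * threshold)
        # Smooth the running average so one loud hit doesn't deafen the detector
        avg = energy if avg is None else 0.9 * avg + 0.1 * energy
    return hits
```

In a real set the `True` frames would trigger the strobe/glitch; real snare detection usually also band-filters the audio first, but the threshold-over-average idea is the core of it.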
I suppose the only other thing here is that I try really hard to not fall back on stuff I’ve done before too much. There’s one or two visual themes that I really enjoy working with, and I can do them with my eyes shut, but I try to not use them as a safety net if I can help it. However what I do tend to do is create the same sort of thing as a starting point before sets. This is similar to the written notes in a way, and is usually just a big rotating cube, or a series of concentric circles, or some typography with a small amount of audio detection on. This is enough of a ‘thing’ to get started with, and avoids blank page syndrome a bit. I find a totally black screen and an empty command prompt to be quite an intimidating thing to start with, especially with a crowd also watching.
Live coding usually forms a big part of my visual output, with the other stuff being video, GIFs, and other non-livecoded things like Processing sketches or similar. I have two general ways of working:
- Just a livecoding environment on its own, full-screened with my monitor mirrored to the projector. This happens at Algoraves quite often, and very occasionally at chip/other shows if I want to do something really pixelly and demoscene-y.
- A livecoding environment run through VDMX before outputting to the projector, so I can mix in video, run effects chains on both, and do all sorts of other cool things. More on specifics later. This approach is my general go-to nowadays: I can still pipe livecoded visuals through on their own, or go nuts with video and things if I need to.
In terms of specific software, my weapon of choice is undoubtedly Cyril, a wonderful open source openFrameworks project initiated by Darren Mothersele. I favour Cyril because, again, it's insanely simple to use: it's just an application, some assets, and a window. It's small enough that I can have many versions of it scattered around too, which I like: for example, I have a few versions pre-compiled with different font sizes or different assets in – my "algorave" cut of it contains tons of logos and typography for that specific night.
My general go-to version of Cyril is one I forked privately from the main repo, and it contains all the bits I've added over the last year or two. None of this code is up to scratch enough to push back into the main version, but it's on my to-do list. Specifically, my version has a much smaller font, set to Consolas, and it doesn't change size (I find that really distracting in all livecoding environments that do it). I've also patched in some glitch post-processing, but that doesn't get used now as VDMX is better at it. The final thing I've added is split Syphon outputs: Syphon is the technology that passes frames between applications, and is how I share Cyril with VDMX. I've changed my version of Cyril so that the code editor window is piped out as its own Syphon source, meaning I can isolate the code and visuals and do separate things with them, which is really useful. Kaleidoscoped code is never not cool.
There are other livecoding environments I really like, but don't perform with loads. I highly recommend them all if you're into trying new things out:
- Livecodelab is excellent, and browser-based – the language is very similar to Cyril, and I think there's been a fair bit of development crossover between the two. The thing I really like about LCL, aside from the obvious one that it works wherever Chrome does, is that everything looks really nice by default. Rather than the intimidating black screen of doom, LCL picks colours and has nice defaults for everything if you don't specify. If you're taking your first steps into visual livecoding, Livecodelab should be your first port of call.
- My next pick would be Fluxus, which isn't browser-based but works on everything, I think. Fluxus is really, really powerful, and does tons of stuff I wish Cyril did (and might do one day): shaders, 3D asset loading, and a whole bunch of other cool stuff. The thing that puts me off using it live is the language, Racket, derived (I think?) from Scheme. I can use it, and have done in the comfort of my own home, but I'm definitely not quick or confident enough with it yet to use it live. You can definitely do amazing things with it though – highly recommend a dabble.
If livecoding tickles your fancy, I highly recommend starting with TOPLAP if you want to learn more. There's also a great little forum, Lurk, that's a useful place to post. And, more recently, we've set up a Slack channel for all things livecode, which is here.
I also heartily recommend checking out an Algorave if you haven't already, which will give you a taste of music and visuals created with code. There are tons of performance videos on YouTube to give you an idea, and I've blogged and spoken about them a few times in the past. And if you're lucky enough to live in a town where one pops up, go. They don't happen super often, but they're a wholly unique experience even if you don't have any interest in the code. Take my word for it: you can dance to algorithms.
VIDEO / VJ SIDE
So, livecoding is still very much the primary method I use for creating visuals, but sometimes I want to kick things up a notch in intensity, drop some effects on top, or do something stylistically that would be difficult or impossible in code alone. Or I might want to have a separate, not-yet-visible thing running and mix seamlessly into it, rather than having to code my way in and out of everything.
For this, and almost everything, I use VDMX. Most of the time I opt for free and open source, but for something as complex and flexible as a VJ setup, VDMX just can’t be beaten. I’ve never seen software that happily consumes and outputs so much, it’s actually astounding what people do with it. Videos in the most arcane configurations, age-old Flash animations, live feeds, Syphon inputs, friggin’ Wii remotes, the list is endless.
The downside, of course, is that with great power comes great responsibility, and VDMX can be a very daunting application to get to grips with, particularly at the start. As always though, I keep things very very simple. I use a basic four channel mixer setup, with either one or two instances of Cyril coming in via Syphon, plus video and GIFs and whatever else. The four channel mix works in a split format: there's a left and a right master channel, each of which has its own left and right sub-channels. This lets you quickly crossfade between the two master feeds, but also mix within each one.
An example might be that on the left channel I have a video playing, and meanwhile I'm working on some livecoded visuals that aren't displaying yet. When I'm happy with them, I can crossfade to that channel and show the livecoded stuff, or mix somewhere in between. Then, if I wanted to add, say, a glitch layer over the livecoded part, I could do this within the right-hand channel. It sounds complicated written down, but it's an absolute cinch to use. Again, there are monstrously complex ways to set things up, but I don't like to complicate my life too much.
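If it helps, the maths behind that split mixer is just nested linear crossfades. A minimal Python sketch of the idea (not how VDMX actually implements it – single float "pixels" stand in for whole frames):

```python
# Two-level crossfade mixer: each master channel blends two sources,
# and a master fader blends the two masters. Values are floats in 0..1.

def mix(a, b, t):
    """Linear crossfade: t=0 gives a, t=1 gives b."""
    return a * (1.0 - t) + b * t

def four_channel_mix(left_a, left_b, right_a, right_b,
                     left_fader, right_fader, master_fader):
    left = mix(left_a, left_b, left_fader)     # mix within the left channel
    right = mix(right_a, right_b, right_fader) # mix within the right channel
    return mix(left, right, master_fader)      # crossfade between the masters
```

So with the master fader hard left you see only the left channel's internal mix, and you can prep the right channel in private before fading across – exactly the workflow described above.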
Once you have your inputs and stuff figured out, you can add effects over the top. I use two effects chains, one for each of the mixes, but again if you wanted to go totally crazy you could be more fine-grained than this. However most of my work is random and glitchy enough that this is too much, so two works well. And, of course, there’s nothing to stop you digging extra stuff out while you’re playing – I just prefer not to rock the boat, and instead favour a little setup in advance and then not poking the house of cards too much!
Effects chains are really useful for quickly completely changing the look of what you’re doing. Quite often I’ll keep the livecoded part very simplistic, knowing that I’ll do enough in VDMX to make the base source almost unrecognisable, which is fun. The risk you take with this approach is that you end up relying too much on certain effects, so it’s generally wise to not cane the same ones over and over again – although it could be argued that nobody cares, but… I care.
For example, here’s a bunch of cubes just running raw out of Cyril:
In VDMX you can then do something as simple as changing the colour:
Or, chop it up:
Or picking out the edges:
Or whatever else you can think of. All from the same source. In fact, you can even apply different effects to the same input on different channels, so you can mix between two versions of the same Cyril input – really useful, as the animation will sync perfectly as you crossfade. Or, if you really want to set your computer on fire, you can run two Cyril instances and do different things to them. The sky is the limit.
The other thing I quite like doing is using the livecoded element as a mask. This is a really effective way of using all that lovely audio-reactive livecoded unpredictability, but without having the graphics on screen. Instead, you can set VDMX to use that source as a mask, and what you get is a video underneath, masked out by the audio-reactive element. If you try this, make sure the geometry in Cyril is white, or close to white, for best results.
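Under the hood this kind of masking is just per-pixel multiplication, which is also why white geometry works best: white mask pixels (1.0) let the video through at full strength, while black ones (0.0) hide it. A tiny Python sketch of the idea (not VDMX's actual implementation – frames as nested lists of grayscale values in 0..1):

```python
# Luma masking: the livecoded layer's brightness gates the video layer.

def apply_mask(video, mask):
    """Multiply each video pixel by the corresponding mask pixel."""
    return [[v * m for v, m in zip(vrow, mrow)]
            for vrow, mrow in zip(video, mask)]
```

Feed an audio-reactive white-on-black Cyril frame in as `mask` and the video only shows through wherever the livecoded geometry currently is.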
You can also do really neat audio sync in VDMX, which lets you have audio-reactive video too – this complements the livecoded aspect nicely. What this means is that you can beat-match the current audio and use it to control, say, an animated GIF. Each loop of the GIF is matched to the rhythm, and you end up with something really cool to watch. Full instructions are a bit out of scope for this post, but the VDMX site has a bunch of learning resources if you're interested.
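Conceptually, beat-matching a GIF loop just means mapping the current beat phase onto the frame index so one loop of the GIF lines up with one beat (or bar). A rough Python sketch of that mapping (a hypothetical helper, not VDMX's API):

```python
# Map playback time onto a GIF frame index so the loop tracks the beat.

def gif_frame_for_time(t_seconds, bpm, n_frames, beats_per_loop=1):
    """Return the GIF frame to show so the loop completes every beats_per_loop beats."""
    beat_len = 60.0 / bpm
    loop_len = beat_len * beats_per_loop
    phase = (t_seconds % loop_len) / loop_len  # 0..1 through the current loop
    return int(phase * n_frames)
```

At 120 BPM with a 10-frame GIF, the loop restarts exactly every half second, so the animation snaps to the rhythm regardless of the GIF's native frame timing.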
I honestly barely scratch the surface of what VDMX does, but I've not yet found anything I need that it can't do. I also got the starving artist discount, which is an amazing scheme.
HARDWARE / CONTROLS
The final piece of the puzzle is controlling everything – livecoding is great and flexible and everything, but juggling doing that with controlling video etc would be a handful without external controllers. Again, my two priorities are: keep life simple, and fit it in a backpack.
For a long while I used an iPad running a customised TouchOSC interface, which was quite handy in terms of flexibility – it controlled VDMX, and I even started to add experimental support into Cyril but gave up. This was fine overall, but touch-screens are still a bit fiddly – there’s no tactile support so you have to actually look at what you’re doing, and of course touch screens and sweaty/beery fingers don’t mix well. I even tried stuff like TunaKnobs, which I kickstarted a while ago:
These are neat, but are definitely more of a toy than a serious tool really. They’re not super responsive and I kept losing them.
Nowadays I just use a simple Korg Nanokontrol 2, which is a £30 USB MIDI controller with a bunch of dials and buttons and faders on. There are definitely bigger and more elaborate controllers that I could have invested in, but to be honest I like the fact the Nanokontrol is so cheap. It's a piece of shit of course – bits have fallen off mine and it's a bit plasticky – but ultimately I only use it as a master crossfader and to crossfade the sub-mixes, plus I've got a few buttons to enable common effects. And, at thirty quid I don't mind chucking it around, and when I'm done I can jam it back in my bag and be on my way. Two thumbs up.
The other key piece of hardware for me is an audio interface – obviously to get all that lovely audio reactivity you need (ideally) a clean audio source from the desk. I’ve sometimes ended up just using my shitty laptop microphone, and it’s not as bad as you’d expect really, but if you’re doing anything clever with audio you really want a line-in. I use this little Behringer interface, which lets me run different things into my machine without too much fuss (I always pack long RCA/phono cables plus adaptors, it’s hard to tell what you’ll need until you get there). Again, there are far more expensive and elaborate ones, but this cost something like £40 and it’s teeny tiny.
There’s an audio-in on my laptop too, so technically I could skip the interface a lot of the time, but the interface also lets me quickly drop/raise the gain with a physical dial, so if the music hits a loud/quiet extreme I can compensate super quickly. Physical controls win the day every time.
LAPTOP
Not much to say here – I currently work on the world's most trashed MacBook Pro, a 2011 model. This poor machine has had so much punishment, and it's absolutely fucked at this point (I am in fact writing this blog post on my work laptop at home, as my performance machine won't turn on). I'm not a particular Apple guy, but I've not found a laptop as bullet-proof as this one so far: yes, it's dead now, but the amount of punishment it took to get to this point is quite something.
I’ll be getting a new one if I definitely can’t salvage this machine, and I’m toying with putting my money where my mouth is with open source and switching to Linux full time. However annoyingly there’s a ton of OSX-only software I rely on a lot, notably VDMX! Will have to have a think, hopefully a clean wipe of this machine might possibly fix it, we’ll see.
Overall advice for laptops you perform with: don’t get too attached to it, it’s going to get dinged up a bit. Oh, and my other thing is that when playing live: shut everything down! Applications, notifications, wifi. I’ve had one or two dumbass things pop up while playing and it’s just embarrassing. Pfft.
WHAT’S IN THE BAG?
OK, so I said everything fits in my bag, and I know people LOVE to see what's in bags, so in the name of completeness my entire setup is as simple as:
- Backpack, containing:
  - 15″ MacBook Pro
  - MacBook charger
  - Behringer USB audio interface
  - Power cord extension with six plugs (someone ALWAYS needs this)
  - Korg Nanokontrol 2
  - iPad mini running TouchOSC, possibly with a few Tunaknobs thrown in
  - RCA cables
  - Phono cables
  - RCA to phono adaptors
  - Long VGA cable
  - Long HDMI cable
  - 2x VGA to Thunderbolt adaptors
  - 2x HDMI to Thunderbolt adaptors
  - Notebook (as in, a paper one, with prompts and stuff in) and pen
  - Camera, ideally
  - 1TB solid state drive (for holding the print captures if I'm running them)
And that's it. Pretty straightforward. Obviously if I was doing this for a living and playing massive shows and stuff then this would be laughably simplistic, but in terms of rocking up and playing small things it's handy being able to chuck everything in a bag.
If any or all of this has piqued your interest, here’s a grab-bag of stuff to check out. Of course I’m always available on twitter or email. (Or just get all up in my face when I’m playing live, everyone else does…)
If you got this far, thanks for reading. Peace.
Catch up on Dan’s Blog Here