THIS IS AN OUTDATED VERSION. VIEW THE CURRENT SETUP HERE.
By Ville Hoikkala (28.12.2010)
Second revision 1.2.2011
Third revision 14.2.2011
Fourth revision 3.4.2011
Note: this article is outdated and is being redone. Drop by at the end of August 2014 to find the new version.
There are two reasons I wrote this document. First, when we started Unzyme we really couldn’t find any decent information on setting up a live show with Ableton Live that wasn’t your normal DJ set. It was as if no one used Live for a normal, non-DJ band live performance, or they just didn’t think of letting anyone know how to create a smoothly functioning set. The questions of how to handle situations with multiple songs, each having different midi instruments and effects, were left unanswered. Should each song be contained in a different file? Or should they all be crammed into one? If so, how to make this as resource-friendly as possible? How about arming tracks and launching songs? By trying out a few different methods, gathering tips here and there and by creating some custom hardware, we finally managed to crack what I think is a somewhat smoothly running system. Hopefully this document will help fresh electro groups struggling with the same issues we did. Not to say our system is perfect – far from it. We’d love to hear suggestions on how to improve it. If anyone has solved the multiple-midi-assign dilemma (being able to assign multiple midi commands to a single object in Live), please come forth (we’re pretty sure this could be nailed with a Reaktor or Max patch merging incoming midi commands).
The second reason I wrote this is to provide Joona and Mary (and myself) with an instruction manual in case I get destroyed. Also, I seriously need to write this down before I reach a demented stage and forget everything.
Because of the user-manual point of view, this document contains details that seem very unnecessary in the eyes of a person looking for guidance on his/her live set (such as hooking the mic to Profire 610’s input no. 1 and adjusting the gain to 2 o’clock – WHO CARES). Still, I decided it doesn’t justify making two versions of the document. Now, let’s get to it!
The unzymian live setup is like an octopus or even Cthulhu: multiple tentacles are involved, but their origin is common. In our case the instruments represent the tentacles. Cthulhu is manifested by a MacBook Pro (having an evil cosmic deity as a computer has many advantages). Currently everything runs through Cthulhu: vocals, keyboards, drums, backing track, VJ information, midi controllers and the blood of nonbelievers.
Most of the gear is housed within an old bass amplifier, which we named… The Hive! We converted the Hive to house the components and cables so that setting up the live set wouldn’t require hassling with billions and billions of cables and components stacked in random boxes. The Hive also contains the PC (Hiveputer) which runs the VJ (visual jockey) set. The MacBook Pro is laid on top of the Hive, so it also works as a table. The audio is output as two mono channels (left and right). Originally we separated each track to a different output (our sound card allows eight outputs). It might have been better in terms of mixing, but that configuration makes it impossible to apply effects to the master channel (since there is no “master”, only outgoing channels bypassing it). The drawback of the new configuration is that mixing is currently limited to the virtual sliders in Ableton Live. We’re planning on building a wireless midi mixer, or maybe utilizing the Wireless Mixer application on the Android mobile platform.
2.1 Digital Audio Workstation (DAW) and its control interface (NVR)
Our choice of software is Ableton Live. It’s the digital core of the unzymian practical science, live and in the studio. In a live situation (pun intended) Ableton is controlled with a custom-built midi controller, the Neuro Valve Regulator (NVR). An in-depth article on the NVR will be written soon, but for now let’s just stick with a few lines (which assume the reader has some knowledge of how Ableton Live works).
(The pics are of an earlier version lacking some features.) NVR is an Arduino-based midi controller, housed within metal parts scavenged from a junkyard. Building the thing was necessary, because each song in our Ableton Live set has a different set of instruments and effects, which all need to be armed simultaneously when we want to switch songs. Doing this manually would take too much time and be rather imprecise and boring. The armings could be mapped to the keyboard or a ready-made midi controller, but then a problem would emerge with something very important: the play button. You see, we include the components of each song (backing track, click track, video midi track etc.) on a single scene in Ableton’s session view. A song is launched by clicking the play button on the corresponding scene on the right side of the screen, which launches all the components simultaneously. Since there are as many play buttons as there are songs, we’d need a big array of physical play buttons in addition to the arm buttons. Fitting them on a computer keyboard would be impractical due to size limitations and problems with labeling the buttons, and sacrificing so many buttons to “play” would be plain stupid on a normal midi controller as well. So, I figured we needed a controller with one play button, whose function would alter based on which song is currently armed. This is exactly how NVR works – turn a switch on the arming panel and it’ll change the variable which contains the midi message sent by pressing the single play button. These different midi messages are then mapped in Ableton to the corresponding scenes. If nothing is armed, no midi message is sent. If multiple songs are armed, same thing (also a cool warning light will fire up).
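The play-button logic above is simple enough to sketch in a few lines. This is not the actual Arduino code (which lives on the NVR itself) – just an illustration of the decision rule, with made-up note numbers:

```python
# Sketch of the NVR's single-play-button rule (illustration only, not the
# real firmware): the armed switch decides which midi note the play button
# sends. Nothing armed, or more than one thing armed, sends nothing.

def play_button_message(armed_switches, song_notes):
    """armed_switches: one boolean per song switch on the arming panel.
    song_notes: the midi note mapped to each song's scene launch in Live.
    Returns a (status, note, velocity) tuple, or None if no message is sent.
    """
    armed = [i for i, on in enumerate(armed_switches) if on]
    if len(armed) != 1:
        return None  # nothing armed, or a conflict (the warning-light case)
    return (0x90, song_notes[armed[0]], 127)  # note-on, channel 1


# Song 2 armed -> the play button launches scene 2's mapped note.
print(play_button_message([False, True, False], [60, 61, 62]))
```

In Live, each of these notes is midi-mapped to one scene’s launch button, which is how one physical button stands in for a whole column of play buttons.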
The first image shows an overview of the set, where everything is grouped. The second pic unveils the drum group. The third pic shows the configuration of the Vocal track.
Download the Ableton Live file here. It contains Audio Units, so compatibility with PCs is dubious.
Anyone familiar with programming will enjoy a set of facepalms – the code is ugly as hell. I know things could be made better with custom functions, arrays, for and while loops etc. However, it does its job as it is. If you wish to participate by making the code more professional, contact us!
We also tried employing Max for Live, but couldn’t get the Arduino Mega to co-operate well enough. We’ll probably migrate in this direction in the future though, as things become more compatible with each other.
All instruments and vocals are connected to an M-audio Profire 610 external firewire sound card. The sound card will be referred to as Maukka.
Vocals are run through a standard microphone cable to Maukka’s line in no. 1 (important!). The pre-amp volume of this line should be set to 2 o’clock. Inside Ableton Live the input is picked up by the leftmost audio track labeled “Vocals”. This track is always armed. Within the track there’s an audio rack. Inside this rack are the unique settings for each song. Each time a song selection command is received from the NVR a different rack setting is applied. Most songs have very similar settings, but due to certain midi restrictions in Live each song must have its own rack. Commonly a song has a compressor and some reverb. Some songs, like Anniversaries, have special effects which are further controlled from the NVR.
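The per-song switching inside the vocal rack works like Live’s chain selector: each song’s effect chain occupies a zone on the chain-select ruler, and the NVR’s song command moves the selector into the right zone. A minimal sketch of that selection logic (the song names and zones here are made up for illustration):

```python
# Sketch of rack chain selection (illustrative names and zones, not our
# actual vocal rack): each chain covers a [lo, hi] zone on the chain-select
# ruler (0-127); the selector value picks the active chain.

def chain_for_selector(chains, selector_value):
    """Return the name of the chain whose zone contains selector_value,
    or None if the selector falls outside every zone (no chain plays)."""
    for name, (lo, hi) in chains.items():
        if lo <= selector_value <= hi:
            return name
    return None


# One narrow zone per song: selector value 0 -> song A's chain, 1 -> song B's.
vocal_chains = {"song_a": (0, 0), "song_b": (1, 1), "anniversaries": (2, 2)}
print(chain_for_selector(vocal_chains, 2))
```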
The keyboards, two of them, are midi based. The bigger keyboard (referred to as Yamaha from now on) is connected with a long midi cable to Edirol UM-2 midi-usb-converter’s midi in no. 1. The signal is converted to serial and run through usb to the powered USB-hub and eventually to Mactopus, wherein the signal reverts back to midi before reaching Ableton Live.
The smaller keyboard (Smurf) doesn’t have a physical midi port – it’s USB all the way (though obviously the serial signal is converted to midi by the software driver in the end). The connection is made with the long black USB-B (square-shaped) cord to the same hub as the UM-2. The smaller keyboard sometimes has unexplainable, possibly malicious issues – it’s important to test it thoroughly before hitting the stage.
In Ableton the keyboard tracks are laid next to each other, song by song, starting from the left. All tracks (about 15 of them) are contained in the Keyboard group, which is relayed to the master channel (individual tracks are only sent to the group). When both Yamaha and Smurf are used in the same song, the tracks are obviously next to each other. The tracks are armed by commands from the NVR.
The drums are also midi based. The drum brain (Roland TD-12) is only used to gather the signals from the pads (even though some useful settings can be applied at this point, such as sensitivity curves). The kit containing the correctly calibrated pads is kit no. 1 (in the TD-12). When having trouble with random samples launched by even more random pads, check the kit number! The faders on the drum brain are negligible – they don’t do squat in our case.
The signal is sent via a midi cable from Roland TD-12’s midi out. The cable is plugged to UM-2 midi in no. 2. Just as with the keyboards, the signal is converted to serial and run through the hub, reaching Mactopus and eventually Live in pure midi form.
The VST used to launch the drum samples is Native Instruments’ Battery 3. Each song has its own track and kit, all contained within the Drums group. The configuration is similar to the keyboards: the tracks are armed by the NVR, the outputs go to the group and the group is forwarded to master. The kits are built to be as lightweight as possible, each having a standard layout containing only samples which are actually used. Adjacent to each kit there are tracks named “kick” and “snare”, which obviously handle the kick and snare drum outputs from the main Battery. When having to mix drums, these two components are the most likely to be fiddled with – having them on separate tracks makes the job faster and easier.
Drum effects are controlled with a Novation Remote SL Compact midi controller. The pitch wheel has a beat repeat working both ways (1/4 one way, 1/8 the other), the mod wheel is assigned to a ping-pong delay, and a few knobs are assigned to effects such as sweep and deconstruct (an Ableton Live preset rack). I also use the Ableton Looper: “record” is assigned to a key, and “stop & clear” to the key adjacent to it. This is very handy when you need to fix something while playing (happens too often on our gigs). It’s also fun to trap your beat in a loop and start fooling around with effects. This creates confusion among the crowd – is he playing the drums or is it playback? Confusion is good.
The most recent addition to the Unzyme live experience is incorporating music-synced visuals into the performance. The PC running the videos (VJ-machine) is housed within the Hive. It’s synced to the Mac’s Ableton Live with a one-way midi cable, and the VJ software of choice is Resolume Avenue 3. First we thought of firing short video loops by using dedicated midi-video tracks for each song, thus keeping things dynamic and awesome. But after realizing it takes ages (we don’t have enough ages) and that it’s really clumsy in every way, we decided to craft full-song-length videos and then just launch them with a single dedicated midi command at the start of the song. This is probably against the holy visual jockey etiquette, but screw you too! The video is output via a 10-metre-long VGA cable to an old video projector. The picture is projected onto a portable silver screen.
As previously mentioned, the whole organism runs on Ableton Live. In addition, two other programs are needed. The first is the serial-to-midi (STM) program (desktop -> Unzyme -> Applications -> Serial to midi), which is needed to convert the serial signals coming from the NVR to midi – it’s the interpreter between two different digital languages. Setting up the STM is covered in section four.
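To make the “interpreter” idea concrete, here’s a sketch of the core job such a bridge does. We’re assuming the NVR simply sends raw 3-byte midi messages over the serial line (the actual STM program is third-party software, so its internals may differ):

```python
# Sketch of a serial-to-midi bridge's core task (assumed framing: the
# device sends plain 3-byte midi channel messages over serial, e.g. at
# 57600 baud). Not the actual STM program, just the idea.

def parse_midi_stream(data):
    """Split raw serial bytes into 3-byte midi messages.

    Status bytes have the high bit set (0x80-0xFF); data bytes don't.
    Stray data bytes before a status byte are skipped to resynchronize.
    """
    messages = []
    i = 0
    while i + 2 < len(data):
        if data[i] & 0x80:  # found a status byte -> take a 3-byte message
            messages.append((data[i], data[i + 1], data[i + 2]))
            i += 3
        else:
            i += 1          # garbage or mid-message byte: resync
    return messages


# A note-on (0x90) and a control change (0xB0) arriving in one serial read.
print(parse_midi_stream(bytes([0x90, 60, 127, 0xB0, 1, 64])))
```

The parsed messages are then handed to a virtual midi port, which is what Ableton Live sees as the NVR’s input.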
The second piece of software, OSCulator, is used to connect an Android-based midi controller called Fingerplay midi to the set. This requires a WLAN, which is hosted by the phone itself (Ville’s Nexus One). OSCulator can also be used to connect a Wiimote, but we’ve yet to find a suitable use for it.
Also, the VJ-machine uses Resolume Avenue (found on the desktop of Hiveputer).
4. Putting it all together
Setting up the stage mostly relies on organized thinking, but also on a few moves based on everything but organized thinking. First and foremost, the NVR must never be hooked up to the USB hub. Second, the NVR must be hooked up and the serial-to-midi program must be configured before the hub is connected. A foolproof setup is explained in a few steps. This setup assumes that Ableton Live, or any other program, is not yet running (the set needs all the CPU and memory it can get, so try to minimize the load from other software).
- Hook up and fire up Maukka (the firewire sound card)
- Connect microphone cable to Maukka’s line in no. 1. Set gain to 2 o’clock.
- Connect the outputs to the back of Maukka. If only two tracks are being output (left and right), connect the cables to outputs 1 and 2. The third output is the click track and monitor for the drummer; connect it to the headphone amplifier. If multiple outputs are used (usually the case when throwing a gig), the outputs must be configured in Live. The outputs to alter are those of the return tracks.
- After making sure that all the song-chooser switches are off, connect the NVR to one of the Mac’s USB ports (not the hub!). A successful launch is indicated by flashing lights on the NVR. Launch serial-to-midi (desktop -> Unzyme -> Applications -> Serial-to-midi). The program needs step-by-step configuration. First you must choose a serial port – choose the first one on the list (press A). Next you need to determine the baud rate (the rate of communication) – use 57600. After this a midi input port must be set – choose M-audio Profire 610 (the input port is not currently used, so choosing Maukka’s unused midi in equals choosing nothing at all; this is just a placeholder, since there is no “no input” option in the program). [nowadays we need midi in too because of the metronome in NVR. Can’t remember which port though, edit later] When requested a midi output port, choose [can’t remember, edit later]. Note that the midi ports are chosen from the NVR’s perspective – the output in this program equals the corresponding input in Ableton Live. A successful configuration restarts the NVR, and the lights flash as they did when the device was first connected.
- Fire up Ableton Live, go to settings (CMD + ,), choose Profire 610 as the sound card (both in- and output) and only then open the main live-performance file (choosing the sound card after loading the set usually crashes everything) (something revolutions something multiple outputs whatever). The fastest way to do this is usually to choose the file from ‘recent files’. Loading the big set takes a minute or two. Next, make sure only the Vocal track is armed and that the rack within it has no active ‘songs’.
- Connect the midi cables (from drums and Yamaha keyboard) to Edirol UM-2 usb-to-midi converter. Yamaha goes to input 1 and the drums to input 2. Connect UM-2 to any free USB-port (hub or no hub). Hook up Smurf (the small keyboard) to any USB-port as well. The Novation midi-controller can also be hooked up to any port.
- Getting ready now! Switch on a song from the NVR and check that the right tracks and the vocal rack are armed. Individually check Yamaha, Smurf, Novation, drums, vocals and click track. You’re done!
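Since the one rule that really matters is the ordering (NVR and serial-to-midi before the hub), the steps above can be sketched as a checklist with that constraint made explicit. The step names are just shorthand for the bullet points, not software we actually use:

```python
# Sketch: the setup steps above as an ordered checklist, so the one hard
# constraint (serial-to-midi configured BEFORE the USB hub is connected)
# can be checked. Step names are shorthand for the bullets, nothing more.

SETUP_ORDER = [
    "connect_maukka",            # sound card on first
    "connect_mic_and_outputs",
    "connect_nvr",               # straight to the Mac, never to the hub
    "configure_serial_to_midi",
    "launch_ableton_live",
    "connect_usb_hub",           # UM-2, Smurf, Novation
    "test_each_instrument",
]

def nvr_before_hub(order):
    """True if serial-to-midi gets configured before the hub is connected."""
    return order.index("configure_serial_to_midi") < order.index("connect_usb_hub")


print(nvr_before_hub(SETUP_ORDER))
```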
Sometimes weird things happen and debugging is required. Mental debugging usually provides the fastest means of repair. Just focus on each signal path – where it begins and where it should end. Think of each intermediate step and you’ll quickly discover where the signal breaks. If a few minutes of extensive brainstorming yields no results, restarting Live might do the trick (you don’t need to disconnect everything after doing this, just launch it again – the steps above are very flexible and provide just one, though foolproof, way of setting up the Machine). There might also be an issue with Maukka’s settings. Open up Maukka’s control panel from the Mac’s settings and see if something’s wrong. In extreme cases the Mac can be restarted.
Any questions? Or better yet, suggestions on improving the set or this article? Write a comment below.
6. Future implementations
- review “putting it all together”, and perform a test on a lab animal
- more details on VJ stuff
- pics of Live file, make it downloadable
- specify details of STM, OSCulator and Fingerplaymidi
- specify debugging (especially Maukka’s settings)
- list future implementations and current thoughts of concern