When we started talking
about The Revenger's Tragedy over two years ago, we knew the
basic feel of the show. Everything would be synthetic, detached in some way
from the equivalent real-life version. The lighting would be reminiscent of
club lighting, and the music would be electronic. We also knew that we wanted
to link the control of lighting and sound via MIDI Show Control (MSC).
At first, I attempted to do this using Stage Research's SFX software alone. I'd used SFX many times before, but I'd never attempted to make changes in the music sound perfectly musical; that is to say, I'd crossfaded between pieces of music with SFX, but it's nearly impossible to make that change imperceptible to the audience. Because the music we chose for Revenger's is electronic, it was incredibly important to never lose the "four on the floor" house beat. Even a person without any musical training can tell when the music gains or loses even a fraction of a measure. With real musicians, a little variation in timing is more acceptable, and is sometimes integral to the sound of certain musical genres.
Designing a show to run on electronic equipment, that is, any show that is not going to be performed live by a band or ensemble, is sometimes like working backwards. If the music were performed live, we could have just told the musical director or conductor to "vamp" certain sections of music under an actor's lines. This is a common occurrence in musical theater. Even if a musical theatergoer has never heard the term "vamping," they know exactly what it means upon description of the idea. With a live band, sections to be vamped would be indicated in the score. With computer playback, the end result is similar, but the method is obviously much different.
So the goal
became to play and then vamp sections of the music. It quickly became evident
that the SFX software was completely unsuited to this task,
especially since the nature of theater is evolutionary. To create the desired
product using SFX, I would have needed an immense amount of
time for programming, editing, and troubleshooting. SFX sees audio simply as files full of data, and doesn't know to treat four bars of music any differently than it would treat a non-musical sound effect.
At this point, I had decided to use two computers to run the sound rig: one would run SFX and one would run Live 4. The SFX machine was responsible for triggering all cues, both lighting and sound. There was a main cue list that the operator triggered, and a second cue list full of light cues that needed to be linked to sound events. Now the sound operator could easily trigger a sound cue in Live and a lighting cue at the exact same time. The catch was that when Live received a GO message, it was programmed (in most instances) to wait for the downbeat to execute the change in music. This way we avoided any musical oddities in timing, like a measure of music cut short, or a measure that seemed to repeat when it wasn't supposed to. But it also meant that if SFX sent its lighting GO message and a Live GO message at the same time, lights could trigger early by as much as nearly a full bar of music.
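The bar-quantized launching that caused this potential lighting lag can be sketched in a few lines. This is an illustration of the behavior, not Live's actual internals; the function name, tempo, and timings are my own examples:

```python
# Sketch of bar-quantized cue launching: a GO is deferred until the
# next downbeat, which is why lighting could lead by up to a full bar
# if it fired immediately. (Illustrative only, not Live's internals.)

def seconds_until_next_downbeat(elapsed_s: float, bpm: float,
                                beats_per_bar: int = 4) -> float:
    """How long a GO pressed at `elapsed_s` waits before it executes."""
    bar_len = beats_per_bar * 60.0 / bpm   # length of one bar at this tempo
    into_bar = elapsed_s % bar_len         # position within the current bar
    return 0.0 if into_bar == 0 else bar_len - into_bar

# At 128 BPM in 4/4, one bar lasts 1.875 s. A GO pressed half a second
# into a bar is held for the remaining 1.375 s before the music changes.
wait = seconds_until_next_downbeat(elapsed_s=9.875, bpm=128.0)
```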
The answer came from
Live's ability to handle MIDI as well as audio. Introduced with version 4, Live
can host virtual instruments such as synthesizers and samplers, but it can also
send MIDI data out of any MIDI output available to the host computer. This
feature, combined with the ability to program SFX to listen to
a MIDI port and trigger certain cues when it receives specific MIDI messages,
made perfect synchronization possible.
The final control chain worked as follows: the operator presses GO in SFX, triggering a sound cue in Live. Live receives the command and waits until the next musically appropriate point in time to execute the GO. When the next preprogrammed opportunity presents itself (no more than one bar of music away), Live takes its next cue, and at the same time sends a MIDI message back to the SFX computer. SFX hears the confirmation message and sends the appropriate GO message to lighting, triggering the light cue that corresponds to the change in the music.
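The round trip above can be modeled as a tiny handshake simulation. All of the names here are illustrative stand-ins, not real SFX or Live APIs; the point is only the ordering, in which lighting cannot fire until Live confirms the downbeat:

```python
# Toy model of the control chain: SFX -> Live -> (wait for downbeat)
# -> confirmation back to SFX -> MSC GO out to lighting.
# Names and queues are illustrative, not real SFX or Live interfaces.
import queue

live_inbox = queue.Queue()   # MIDI from the SFX machine to Live
sfx_inbox = queue.Queue()    # confirmation MIDI from Live back to SFX

def operator_presses_go():
    live_inbox.put("GO")                 # 1. SFX fires the sound cue

def live_tick(on_downbeat: bool):
    """Live only acts on a pending GO at a musical boundary."""
    if on_downbeat and not live_inbox.empty():
        live_inbox.get()                 # 2. execute the cue on the downbeat
        sfx_inbox.put(("note", 0, 10))   # 3. confirmation note back to SFX

def sfx_tick():
    """SFX forwards an MSC GO to lighting only after confirmation."""
    if not sfx_inbox.empty():
        sfx_inbox.get()
        return "MSC GO -> lighting"      # 4. light cue lands on the beat
    return None
```

Running `operator_presses_go()` and then ticking both ends shows that `sfx_tick()` returns nothing until a tick with `on_downbeat=True` has occurred, mirroring how the light cue waits for the music.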
As soon as
this setup occurred to me, the rest was really just troubleshooting.
Limitations in the ETC Express Lighting Console and in the JandsHOG Moving
Light Console required a MIDI patch bay. The patch bay routed MIDI between the
SFX computer and lighting control. Bi-directional MIDI
communication was needed so that MIDI data could be both sent
and received by each rig. Theoretically, this could be
done with a multi-port MIDI interface and MIDI routing software on one of the
sound computers, but since this was the first test of this combination of
technology, a standalone MIDI merger/router seemed a more stable choice.
This approach also made programming the MIDI in Live easier, since it's easy to get
confused about which note you are dealing with in Live, but when adjusting the
velocity, numerical feedback is displayed. Most of the show was triggered on
MIDI note 0 (the lowest C in the MIDI range), with velocities becoming the real
distinguishing information between triggers.
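A fixed-note, variable-velocity trigger of this kind is easy to see at the byte level. The byte layout below follows the MIDI 1.0 note-on format; the helper function and the two example velocities are mine:

```python
# Sketch: raw MIDI note-on messages for fixed-note triggering. The note
# number stays at 0 and the velocity alone identifies the trigger.
# (Byte layout per the MIDI 1.0 spec; the scheme is as described above.)

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a 3-byte MIDI note-on: status byte, note number, velocity."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

# Two distinct triggers on the same note, told apart by velocity alone.
trigger_a = note_on(channel=0, note=0, velocity=10)
trigger_b = note_on(channel=0, note=0, velocity=11)
```

Because velocity is shown numerically in Live's editor, entering and checking these values is far less error-prone than telling low piano-roll notes apart by eye.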
Creating a single lighting trigger took several steps. First, a MIDI note trigger was assigned. This information went into the appropriate Live MIDI clip at the point in time the cue was to be triggered. The same note/velocity information was also entered in SFX; this is the message SFX listens for, and when it hears it, it sends the corresponding MSC command out to lighting. The last step in the setup process was capturing the MSC command data from lighting. We were using the SFX Pro Audio version. There is also a Show Control version, and a combined Pro Audio/Show Control version, but since we were on a budget (and mostly operating out of my personal money flow), I opted to stretch the Pro Audio version into doing what I needed. This meant that instead of simply being able to program "Lighting Device 1, Go to Cue 28.6," we did the same thing, but essentially recorded that message from the lighting MIDI out port. Once captured, this data could be sent back to lighting to trigger the cue in performance mode. Occasionally we would capture bad data for some reason, but recapturing the message always fixed the problem.
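The MSC data being captured and replayed has a simple, documented shape. The sketch below builds a GO message per the MIDI Show Control 1.0 byte layout (sysex F0 7F, device ID, 02, command format, command, data, F7); the device ID and cue number are the example from the text, and the helper name is my own:

```python
# Sketch of an MSC "GO" sysex message like the ones SFX captured from
# the lighting console's MIDI out and replayed in performance.
# Layout per the MIDI Show Control 1.0 spec; 0x01 0x01 below means
# command format "Lighting (General)" and command "GO".

def msc_go(device_id: int, cue: str) -> bytes:
    """MSC GO for a lighting device, cue number encoded as ASCII."""
    header = bytes([0xF0, 0x7F, device_id, 0x02, 0x01, 0x01])
    return header + cue.encode("ascii") + bytes([0xF7])

# "Lighting Device 1, Go to Cue 28.6" -- the example from the text.
msg = msc_go(device_id=1, cue="28.6")
```

Capturing the console's own bytes, rather than hand-assembling them, sidesteps any console-specific quirks in how the data field is filled in, which is presumably why recapturing always cured the occasional bad message.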
Now that all of the known bugs have been worked out of the system, I plan to make a template show, taking care of most of the tedious, time-consuming work of setting up triggers and velocity settings. I plan to use the same technology in my thesis production, expanding on its incredible potential. Combining SFX's ability to store and transmit MIDI Show Control data with Live's ability to perform musically has enabled us to do something that, as far as I know, has never been done in this way before. Best of all, it uses off-the-shelf technology that can be programmed by anyone who is so inclined.
Another great feature of Live is that it works on both Macintosh and Windows computers,
meaning I was able to program and test the concepts on my laptop, and then when
we entered the space I transferred the project over to the PC that would be
running the show.
The Revenger's Tragedy has been an exhilarating show to design, and I've learned more in the last two months than I have in a very long time. I'm completely sold on Live 4 as an incredibly capable sound design tool. My next personal project is to prove that the same concept of electronic vamping can be accomplished with similar techniques for music that is not electronic. The nature of the music we chose made this project possible in a reasonable time frame. I'm convinced it can be done with more orchestral music; now it just remains to actually try it. I know it's possible, albeit with a somewhat more complicated series of autofollows and loops, and certainly with more care in editing the source music. Minimal tech-house music is just more forgiving than music with longer sustain and decay times.