The Revenger's Tragedy
Unique Sound Design Challenges
By Matt Griffin

When we started talking about The Revenger's Tragedy over two years ago, we knew the basic feel of the show. Everything would be synthetic, detached in some way from its real-life equivalent. The lighting would be reminiscent of club lighting, and the music would be electronic. We also knew that we wanted to link the control of lighting and sound via MIDI Show Control (MSC).

In developing the opening sequence of the show, I knew I was facing a challenge that few people outside of sound design realize is a major problem. We wanted to use a piece of music we'd found, but we also wanted to cue different sections of it at will, without having to worry about the actor's exact timing remaining perfectly consistent from day to day.

At first, I attempted to do this using Stage Research's SFX software alone. I'd used SFX many times before, but I'd never attempted to make changes in the music sound perfectly musical; that is to say, I'd crossfaded between pieces of music with SFX, but it's nearly impossible to make that change imperceptible to the audience. Because the music we chose for Revenger's is electronic, it was incredibly important never to lose the "four on the floor" house beat. Even a person without any musical training can tell if the music gains or loses even a fraction of a musical measure. With real musicians, a little variation in timing is more acceptable, and is sometimes integral to the sound of certain musical genres.

Designing a show to run on electronic equipment (i.e., any show that is not going to be performed live by a band or ensemble) is sometimes like working backwards. If the music were performed live, we could have just told the musical director/conductor to "vamp" certain sections of music under an actor's lines. This is a common occurrence in musical theater. Even if a musical theatergoer has never heard the term "vamping," they know exactly what it means once the idea is described. With a live band, sections to be vamped would be indicated in the score. With computer playback, the end result is similar, but the method is obviously much different.

So the goal became to play and then vamp sections of the music. It quickly became evident that the SFX software was completely unsuited to this task, especially since the nature of theater is evolutionary. To create the desired product using SFX, I would have needed an immense amount of time for programming, editing, and troubleshooting. SFX sees audio simply as files full of data, and doesn't know to treat four bars of music any differently than it would a non-musical sound, like thunder.

While facing the problem I had with SFX, I was playing with another application, called Live 4, which is specifically designed to work with musical loops. In Live, you can tell the program where the musical beats fall within an audio file. There are several methods for doing this, but in the end they all accomplish the same task. Live is relatively new to the sound world, and is marketed primarily as a tool for musicians. It's billed as a "sequencing instrument," but the limits of what it can do are really bound only by one's imagination and patience in experimentation.
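
For readers who want a concrete picture, the arithmetic behind that idea (mapping musical time onto positions in an audio file) looks something like the Python sketch below. The tempo and sample rate are assumed values, and this is only an illustration of the concept; Live's actual warping is far more sophisticated.

```python
# Illustration only (not Live's actual code): the arithmetic behind
# mapping musical time onto sample positions in an audio file.
# Assumes a fixed tempo throughout.

SAMPLE_RATE = 44100   # CD-quality samples per second
BPM = 128             # an assumed tech-house tempo
BEATS_PER_BAR = 4     # "four on the floor"

samples_per_beat = SAMPLE_RATE * 60 / BPM

def bar_start_sample(bar_number):
    """Sample offset at which a given bar (0-indexed) begins."""
    return round(bar_number * BEATS_PER_BAR * samples_per_beat)

for bar in range(4):
    print(f"bar {bar} starts at sample {bar_start_sample(bar)}")
```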

Live's musical timing, combined with its new autofollow features and its ability to be triggered remotely by simple MIDI commands, made the opening of The Revenger's Tragedy possible. However, just as Live helped solve one problem, it presented another.

At this point, I had decided to use two computers to run the sound rig: one running SFX and one running Live 4. The SFX machine was responsible for triggering all cues, both lighting and sound. There was a main cue list that the operator triggered, and a second cue list full of light cues which needed to be linked to sound events. Now the sound operator could easily trigger a sound cue in Live and a lighting cue at the exact same time. But Live was programmed (in most instances) to wait for the downbeat before executing a change in music. This way we avoided any musical oddities in timing, like a measure of music cut short, or a measure that seemed to repeat when it wasn't supposed to. However, if SFX sent its lighting GO message and the Live GO message at the same time, the lights could potentially trigger early by as much as nearly a full bar of music.
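
To put numbers on the problem: if a GO arrives somewhere in the middle of a bar and the music change is deferred to the next downbeat, the wait can be anything from zero to just under one full bar. A minimal sketch, assuming a 128 BPM tempo:

```python
# Sketch of the timing problem: a GO that arrives mid-bar is deferred
# to the next downbeat, so the music waits anywhere from zero to just
# under one full bar. Tempo is an assumed value.

BPM = 128
BEATS_PER_BAR = 4
seconds_per_bar = 60 / BPM * BEATS_PER_BAR   # 1.875 s at 128 BPM

def delay_until_downbeat(seconds_into_bar):
    """How long a bar-quantized GO waits, given its arrival point."""
    return (seconds_per_bar - seconds_into_bar) % seconds_per_bar

print(delay_until_downbeat(0.0))   # 0.0   - lands on the downbeat
print(delay_until_downbeat(0.1))   # 1.775 - waits nearly a full bar
```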

The answer came from Live's ability to handle MIDI as well as audio. Introduced with version 4, Live can host virtual instruments such as synthesizers and samplers, but it can also send MIDI data out of any MIDI output available to the host computer. This feature, combined with the ability to program SFX to listen to a MIDI port and trigger certain cues when it receives specific MIDI messages, made perfect synchronization possible.
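
For illustration, the kind of message involved can be expressed in a few lines of Python using the mido library. This wasn't part of the actual rig (Live emitted the message from a MIDI clip, not from code), and the port name here is hypothetical:

```python
# Hypothetical port name; in the show, Live itself emitted this
# message from a MIDI clip rather than any Python code.
import mido

with mido.open_output("To SFX Machine") as port:
    # One specific confirmation message: note 0 at velocity 42
    port.send(mido.Message("note_on", note=0, velocity=42))
```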

The final control chain worked as follows: the operator presses GO in SFX, triggering a sound cue in Live. Live receives the command and waits until the next musically appropriate point in time to execute the GO. When the next preprogrammed opportunity presents itself (no more than one bar of music away), Live takes its next cue, and at the same time sends a MIDI message back to the SFX computer. SFX hears the confirmation message and sends the appropriate GO message to lighting, triggering the light cue that corresponds to the change in the music.
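
Here is a toy simulation of that round trip, with the operator, Live, SFX, and lighting all collapsed into one short Python script. Every name in it is a stand-in, and the tempo is assumed; the real chain ran across two computers and physical MIDI connections:

```python
# Toy simulation of the round trip. All names are stand-ins, and the
# dictionary plays the role of SFX's captured MSC data.

import time

BPM = 128                          # assumed tempo
SECONDS_PER_BAR = 60 / BPM * 4

def live_go(clip_name, confirm_msg, midi_out):
    """Live waits for the next downbeat, launches, then confirms."""
    pos = time.monotonic() % SECONDS_PER_BAR
    time.sleep((SECONDS_PER_BAR - pos) % SECONDS_PER_BAR)
    print(f"Live: launching {clip_name!r} on the downbeat")
    midi_out(confirm_msg)          # confirmation back to SFX

def sfx_hears(msg, midi_to_msc):
    """SFX matches the MIDI message and fires the captured MSC GO."""
    print(f"SFX: heard {msg}, sending {midi_to_msc[msg]!r} to lighting")

midi_to_msc = {("note_on", 0, 42): "MSC GO, cue 28.6"}
live_go("verse vamp", ("note_on", 0, 42),
        lambda m: sfx_hears(m, midi_to_msc))
```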

As soon as this setup occurred to me, the rest was really just troubleshooting. Limitations in the ETC Express Lighting Console and in the JandsHOG Moving Light Console required a MIDI patch bay, which routed MIDI between the SFX computer and lighting control. Bi-directional communication was needed so that MIDI data could be both sent and received by each rig. Theoretically, this could be done with a multi-port MIDI interface and MIDI routing software on one of the sound computers, but since this was the first test of this combination of technology, a standalone MIDI merger/router seemed a more stable choice.
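
As a sketch of what that software alternative might have looked like, here is a minimal merge-and-route loop using the Python mido library. The port names are hypothetical, and a real router would also need filtering so neither rig hears its own output echoed back:

```python
# Hypothetical port names; a real router would also filter messages
# to keep each rig's own output from echoing back to it.
import mido

inputs = [mido.open_input("From SFX"), mido.open_input("From Lighting")]
to_sfx = mido.open_output("To SFX")
to_lighting = mido.open_output("To Lighting")

# multi_receive yields messages from whichever input has one waiting,
# which is the "merge" half of a merger/router
for msg in mido.ports.multi_receive(inputs):
    to_sfx.send(msg)
    to_lighting.send(msg)
```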

Since Live's MIDI capabilities are really designed to interface with audio technology and not lighting control systems, a workaround was needed. My first plan was to assign each lighting cue a MIDI note-on message. Live would simply "play" a lighting cue, as if it were playing a note on an outboard synthesizer. Of course, since there are only 128 MIDI notes, it seemed as if we would be limited to triggering a total of 128 lighting cues. Since I had no idea how many cues we would be linking between lights and sound, this 128-cue limitation worried me. Then I thought of a solution.

SFX can be programmed to react to almost any MIDI message. The simplest, of course, is just any note on or off at any velocity (velocity is akin to volume). SFX can also treat individual velocities of a single note as different triggers. So instead of being limited to 128 different cues, we were now talking about 16,256 different possible triggers (128 MIDI notes multiplied by 127 velocity increments = 16,256 different possible messages; to my knowledge, no light board can store that many lighting cues on a single disk, so the issue was immediately moot).
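
The arithmetic, for anyone who wants to check it, along with a hypothetical scheme for mapping cue numbers onto note/velocity pairs:

```python
# Where the 16,256 comes from: velocity 0 is excluded because a
# note-on at velocity 0 is conventionally treated as a note-off.

NOTES = 128        # MIDI notes 0-127
VELOCITIES = 127   # usable velocities 1-127

print(NOTES * VELOCITIES)   # 16256 distinct note-on triggers

def trigger_for_cue(cue_index):
    """Hypothetical mapping from a cue index to a (note, velocity) pair."""
    note, velocity = divmod(cue_index, VELOCITIES)
    return note, velocity + 1   # shift so velocity 0 is never sent

print(trigger_for_cue(0))     # (0, 1):  note 0, velocity 1
print(trigger_for_cue(200))   # (1, 74): wraps onto note 1
```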

This approach also made programming the MIDI in Live easier: it's easy to get confused about which note you're dealing with in Live, but when adjusting velocity, numerical feedback is displayed. Most of the show was triggered on MIDI note 0 (the lowest C in the MIDI range), with velocities becoming the real "trigger."

Creating a single lighting trigger took several steps. A MIDI note/velocity trigger was assigned. This information went into the appropriate Live MIDI clip at the point in time the cue was to be triggered. The same note/velocity information was also entered in SFX; this is the message SFX listens for. When it hears it, it sends the corresponding MSC command out to lighting. The last step in the setup process was capturing the MSC command data from lighting. We were using the SFX Pro Audio version; there is also a Show Control version, and a combined Pro Audio/Show Control version. Since we were on a budget (and mostly operating out of my own pocket), I opted to stretch the Pro Audio version into doing what I needed. This meant that instead of simply programming "Lighting Device 1, Go to Cue 28.6," we did the same thing by essentially recording that message from the lighting console's MIDI out port. Once captured, this data can be sent back to lighting to trigger the cue in performance. Occasionally we would capture bad data for some reason, but recapturing the message always fixed the problem.
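
For the curious, here is roughly what that captured data amounts to. The MIDI Show Control spec defines a GO as a short SysEx message with the cue number carried as ASCII text; this Python sketch builds the message for the "Go to Cue 28.6" example above, assuming a lighting device ID of 1:

```python
# Builds the SysEx bytes for an MSC GO per the MIDI Show Control spec:
# F0 7F <device_id> 02 <command_format> <command> <cue as ASCII> F7.
# The device ID of 1 is assumed from the example above.

def msc_go(device_id, cue_number):
    LIGHTING = 0x01   # MSC command format: Lighting (General)
    GO = 0x01         # MSC command: GO
    return (bytes([0xF0, 0x7F, device_id, 0x02, LIGHTING, GO])
            + cue_number.encode("ascii")
            + bytes([0xF7]))

print(msc_go(1, "28.6").hex(" "))
# f0 7f 01 02 01 01 32 38 2e 36 f7
```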

Now that all of the known bugs have been worked out of the system, I plan to make a template show, taking care of most of the tedious, time-consuming work of setting up triggers and velocity settings. I plan to use the same technology in my thesis production, expanding on its incredible potential. Combining SFX's ability to store and transmit MIDI Show Control data with Live's ability to perform musically has enabled us to do something that, as far as I know, has never been done in this way before. Best of all, it uses off-the-shelf technology that can be programmed by anyone who is so inclined.
Another great feature of Live is that it works on both Macintosh and Windows computers. I was able to program and test the concepts on my laptop, then transfer the project to the PC that would run the show once we entered the space.

The Revenger's Tragedy has been an exhilarating show to design, and I've learned more in the last two months than I have in a very long time. I'm completely sold on Live 4 as an incredibly capable sound design tool. My next personal project is to prove that the same concept of electronic vamping can be accomplished, with similar techniques, for music that is not electronic. The nature of the music we chose made this project possible in a reasonable time frame. I'm convinced it can be done with more orchestral music; now it just remains to actually try it. I know it's possible, albeit with a somewhat more complicated series of autofollows and loops, and certainly with more care in editing the source music. Minimal Tech-House music is just more forgiving than music with longer sustain and decay times.