THE LOST WORLD, LIVE AT THE BARBICAN CINEMA
So, this is the first post I’m making in relation to my new job as the research assistant at the Guildhall School of Music & Drama's Electronic Music Studios. I'm so happy and grateful to be one of those lucky musicians who, aside from freelance music work, has a flexible day job centred around music too. Whew!
The second performance event I’ve been involved in for this job was on Sunday 9th June 2013, when the department put on a screening at the Barbican Cinema of The Lost World, a pioneering film made in 1925. It included never-before-seen stop-motion animation, and the creators were even able to blend live action with animated elements in the same frame (as seen in the above image). Willis Harold O’Brien, the man mainly responsible for the animation and visual effects in The Lost World, also went on to animate King Kong. Both King Kong and Jurassic Park owe a great debt to The Lost World, and would probably not have been made had it not been for the technical and narrative inspiration provided by this film.
We screened the film on the Barbican cinema’s main screen, with students performing their own original music and electronic sound design live onstage. From a technical standpoint, this was an ambitious and elaborate performance. The onstage area was split into 3 zones:
1 - Acoustic instruments
2 - Electronic instruments
3 - Live sound design
Rehearsing at Guildhall
We used a SMPTE timecode signal, in audio form, to synchronise the computers onstage with the film, which was running from a DVD in the cinema. So, the SMPTE code came from the DVD and was sent into the onstage computers. We used a small application called SMPTE Reader to convert SMPTE into MIDI timecode, which was then sent into a virtual MIDI port in Max/MSP. For anyone who ever needs to do something like this, I can’t recommend SMPTE Reader highly enough. It’s a small and simple application that does exactly what it says on the tin. As far as I could find, it was the only thing that could easily achieve what we wanted… and it’s free! There are a million different ways this can be useful.
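For the curious, here’s a rough sketch in Python of the kind of conversion that happens at the MIDI end of this chain: MIDI timecode arrives as eight “quarter-frame” messages that together spell out an HH:MM:SS:FF position. This is just an illustration of the standard MTC quarter-frame format, not SMPTE Reader’s actual internals.

```python
# Sketch: decoding MIDI timecode (MTC) quarter-frame messages into an
# HH:MM:SS:FF position. Illustration of the MTC format only -- not the
# actual implementation of SMPTE Reader.

FRAME_RATES = {0: 24, 1: 25, 2: 29.97, 3: 30}  # two-bit rate code in piece 7

class MTCDecoder:
    """Accumulates the 8 quarter-frame pieces of one MTC timecode."""

    def __init__(self):
        self.pieces = [0] * 8

    def feed(self, status, data):
        """Feed one MIDI message; returns (hh, mm, ss, ff, fps) when complete."""
        if status != 0xF1:          # 0xF1 = MTC quarter-frame status byte
            return None
        piece = (data >> 4) & 0x7   # which of the 8 nibbles this is
        self.pieces[piece] = data & 0x0F
        if piece != 7:              # full timecode only after the last piece
            return None
        p = self.pieces
        frames  = p[0] | ((p[1] & 0x1) << 4)   # 5 bits
        seconds = p[2] | ((p[3] & 0x3) << 4)   # 6 bits
        minutes = p[4] | ((p[5] & 0x3) << 4)   # 6 bits
        hours   = p[6] | ((p[7] & 0x1) << 4)   # 5 bits
        fps = FRAME_RATES[(p[7] >> 1) & 0x3]   # rate code shares piece 7
        return hours, minutes, seconds, frames, fps

# Example: the eight quarter-frames encoding 01:02:03:04 at 25 fps
decoder = MTCDecoder()
for byte in (0x04, 0x10, 0x23, 0x30, 0x42, 0x50, 0x61, 0x72):
    result = decoder.feed(0xF1, byte)
print(result)  # (1, 2, 3, 4, 25)
```

In practice a decoder on the receiving end rebuilds the position like this and hands it to whatever is listening on the virtual MIDI port.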
Once we had MIDI timecode that was controlled by the timeline of the film, we were able to trigger events in Max in accordance with the onscreen action. We had a 5-strong team dedicated to live sound design, who relied on this system. For example, some made sounds vocally into a microphone, which were processed live to sound like dinosaur roars at the appropriate time. The timecode input enabled Max to switch to different processing settings ready for different scenes.
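The cue-switching logic itself is simple: flatten the timecode to an absolute frame count and look up which scene’s preset should currently be active. Here’s a minimal Python sketch of that idea — the cue names and times below are invented for illustration, not our actual cue list, and the real switching lived inside our Max patches.

```python
# Sketch of timecode-driven preset switching, analogous to what our Max
# patches did. Cue points and preset names are invented examples.

def tc_to_frames(hh, mm, ss, ff, fps=25):
    """Flatten an HH:MM:SS:FF timecode to an absolute frame count."""
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Invented cue list: (start frame, processing preset for that scene)
CUES = sorted([
    (tc_to_frames(0, 0, 0, 0),   "jungle-ambience"),
    (tc_to_frames(0, 12, 30, 0), "dinosaur-roar"),
    (tc_to_frames(0, 45, 10, 0), "volcano-rumble"),
])

def preset_for(frame):
    """Return the preset active at a given frame position."""
    current = CUES[0][1]
    for start, name in CUES:
        if frame >= start:
            current = name   # this cue has already started
        else:
            break            # cues are sorted; later ones haven't started
    return current

print(preset_for(tc_to_frames(0, 13, 0, 0)))  # dinosaur-roar
```

Each incoming timecode update runs a lookup like this, so the processing settings change automatically as the film plays, with no one having to count bars or watch a stopwatch.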
Below is a diagram showing how the equipment was routed onstage:
The SMPTE timecode signal from the DVD was sent to CPUs 1-5, where it was decoded to trigger parameter changes and provide a visual counter to guide the musicians using those machines. CPU1 made a submix onstage of the signals from CPUs 2-5, which were run by our live SFX team. CPUs 6-8 delivered electronic musical elements, sub-mixed onstage with a small mixer. The 3 microphones were for live acoustic instruments. The headphone amp sent the click track from the DVD to the onstage musicians.
The equipment for this performance was fairly hefty, and we were very reliant on a great number of things not screwing up. In our final technical rehearsal on the day, it felt as though pretty much every element of the setup screwed up once at different points, but we managed to iron out the kinks very quickly and the performance went perfectly!