Creation of the iLisa demo soundtrack

I recently worked with the Carnegie Mellon Computer Club on a demo entitled Introducing the iLisa, which won 1st place in the Assembly 2012 Real Wild Demo competition. I composed the music and wrote some supporting code, for custom sound hardware designed and built by the CMUCC team. Before I say too much more, here is the demo itself:

The Hardware

The “Voice of Lisa” sound card is a Yamaha OPL3 FM chip with a 16-bit DAC and a CPLD capable of playing stereo PCM samples at a fixed sample rate. In this way it’s pretty similar to the hardware of the Sega Genesis or Sharp X68000, which both combine an FM chip with (usually) non-resampled sample playback. Not similar enough, though, that there’s any existing tool I could use to compose for it. The OPL3 chip, used on the AdLib and SB16 PC sound cards, maps poorly to the more common OPN/OPM family of chips (which have less polyphony but a wider timbre range).

The Approach

The CMUCC guys handed off some nicely detailed specs on the hardware with register maps, and a Java test program that listened to live MIDI input and converted it into instructions for an OPL3 emulator. The general idea was that the final soundtrack would simply be a timed list of OPL3 instructions and another list of PCM hardware instructions, to be played back at 60 Hz. The test program would eventually be extended to output this list.
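As a rough sketch of what such a list boils down to (all of the type and function names below are my own illustration, not the actual CMUCC format):

    import java.util.List;

    // Hypothetical sketch of the soundtrack format described above: each
    // 60 Hz frame carries a list of OPL3 register writes and a list of
    // PCM commands, flushed to the hardware (or emulator) once per frame.
    public class FramePlayback {
        record OplWrite(int register, int value) {}      // one OPL3 register write
        record PcmCommand(int opcode, int argument) {}   // one PCM hardware instruction
        record Frame(List<OplWrite> opl, List<PcmCommand> pcm) {}

        // Called once per frame by a 60 Hz timer.
        static void playFrame(Frame frame) {
            for (OplWrite w : frame.opl())   writeOplRegister(w.register(), w.value());
            for (PcmCommand c : frame.pcm()) writePcmCommand(c.opcode(), c.argument());
        }

        static void writeOplRegister(int reg, int val) { /* emulator or hardware I/O */ }
        static void writePcmCommand(int op, int arg)   { /* emulator or hardware I/O */ }
    }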


Test program, which actually contains a decent OPL3 patch editor! I made about 3/4 of the instruments in the song; the others are from AdPlug’s built-in OPL set.

The next step was to set up a software environment where I could write music, play it through the test program, and have it synchronized with PCM samples. I got everything working with OpenMPT, using its VST-to-MIDI plugin in Channel Mapped mode, with MIDI Yoke routing the MIDI through a virtual cable from OpenMPT to the test program. The alternative would have been to route 16 channels of MIDI through MIDI Yoke from a traditional DAW/MIDI sequencer such as Sonar, but I prefer this way for managing the OPL3’s 18 channels of polyphony, as seen below:


The tracker channels map 1:1 to MIDI channels, which in turn map 1:1 to OPL3 channels. There is no dynamic allocation and no chance that notes will overlap.
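In code, that routing is deliberately trivial; a sketch with hypothetical names:

    // Fixed routing as described above: MIDI channel n always drives OPL3
    // channel n, so there is no voice allocator to steal or reorder notes.
    // keyOn() stands in for the actual OPL3 key-on register writes.
    void noteOn(int midiChannel, int note, int velocity) {
        int oplChannel = midiChannel;   // 1:1 mapping, no dynamic allocation
        keyOn(oplChannel, note, velocity);
    }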

This method also has a few other benefits, thanks to some clever OpenMPT behaviors meant for controlling VST instruments with tracker commands:
• I can easily do a vibrato (Hxx) that maps to pitch wheel events (I added pitch wheel support to the test program for this reason; see the sketch after this list)
• I can do ‘legato mode’ style leads by postponing note-offs automatically (Gxx)
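The pitch wheel handling comes down to bending the note’s frequency and re-deriving the OPL3 F-number from it. A minimal sketch, assuming a ±2 semitone bend range and the usual OPL3 frequency relation (freq = fnum × 49716 / 2^(20 − block)); the real test program code differs:

    // Convert a MIDI note plus 14-bit pitch-wheel value to an OPL3 F-number.
    // Assumes a +/- 2 semitone bend range; block is the OPL3 octave field.
    static int bentFnum(int midiNote, int pitchWheel, int block) {
        double bendSemis = (pitchWheel - 8192) / 8192.0 * 2.0;   // -2..+2
        double freq = 440.0 * Math.pow(2.0, (midiNote - 69 + bendSemis) / 12.0);
        return (int) Math.round(freq * Math.pow(2.0, 20 - block) / 49716.0);
    }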

And of course, OpenMPT can play my drum samples along with the song just fine. Using this setup, I wrote most of the soundtrack, running OpenMPT’s MIDI output live through the Java program together with the drums.

I sent the CMUCC guys a preview recording of the song, and we came up with a rough estimate of how long it needed to be and which sections they wanted.

Conversion

At some point I needed to deliver something besides an mp3. In addition, there was a nagging problem with the current setup: the timing randomly jittered by up to 10 ms. This was unavoidable given the combination of the VST-MIDI plugin, MIDI Yoke, and the way the test program only updated on its own 60 Hz timer. I needed to convert the song to a MIDI file with frame-exact timing; then I could modify the test program to play back the MIDI.

I had to write my own .IT player for the conversion (some prior code got me most of the way there); OpenMPT’s MIDI export was unusable for this level of control. In the converted MIDI output, one tracker tick equals one MIDI tick, which equals one frame.
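The arithmetic that lines the ticks up is simple: at a PPQ resolution of 60 ticks per quarter note and a tempo of 1,000,000 µs per quarter note, each tick lasts exactly 1/60 of a second. A minimal sketch of that setup using javax.sound.midi:

    import javax.sound.midi.*;

    // Frame-exact timing: 60 ticks per quarter at 1,000,000 us per quarter
    // makes one MIDI tick exactly one 60 Hz frame.
    public class FrameExactMidi {
        static Sequence makeSequence() throws InvalidMidiDataException {
            Sequence seq = new Sequence(Sequence.PPQ, 60);  // 60 ticks/quarter
            Track track = seq.createTrack();
            byte[] tempo = {0x0F, 0x42, 0x40};              // 0x0F4240 = 1,000,000 us
            track.add(new MidiEvent(new MetaMessage(0x51, tempo, 3), 0));
            return seq;
        }
    }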
At the same time, I added support for the drums to the test program itself, triggered by a CC message.
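That hook is just a controller-change handler; a hypothetical sketch (the CC number and the value-to-sample mapping are placeholders, not the real ones):

    // A controller change on a reserved CC number starts a PCM drum sample.
    static final int DRUM_TRIGGER_CC = 20;   // placeholder controller number

    void onControlChange(int channel, int controller, int value) {
        if (controller == DRUM_TRIGGER_CC) {
            playPcmSample(value);            // CC value selects the drum sample
        }
    }

    void playPcmSample(int sampleIndex) { /* hand off to the PCM playback path */ }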

Once everything was working, I could save a change to the song and have the test program play back a .mid file, PCM drums included, with perfect timing. I sent my modifications to the test program back to CMUCC, and let them add the register-logging capability and finally play it on the actual hardware. From then on I could just send them the .mid for every update instead of making a recording. I also added timing messages (one CC sent for each section) so the effects could be synced to the song.

At this point there were only 2 days left before the Lisa would be shipped by air freight to Helsinki.

Sync

The day after the Lisa was gone, I got a video of the demo. Now I had to match the song to the effects, rather than having them match the effects to my song. Cutting and repeating song sections got things pretty close, and we also identified a point where the beginning of one effect could be quickly trimmed out to bring the rest of the demo into sync.

Results

I finally got to see the finished product after the demo was put up on the Assembly Archive, soon after the competition ended. Everything actually seemed to be in sync! We didn’t expect to beat the flashier Android/WebGL demos, and it seemed to me there was barely any context for this demo. Did people know about the custom hardware? Either way, it apparently had its charm. Congratulations to CMUCC for all their hard work, especially kbare and mdille3, who were my two contacts. I’m glad I could work with them on a well-received production.
