This fall, while I was reorchestrating my revised opera, I made virtual instruments a bigger part of my process than ever before. Most of us who use notation programs (I use Finale) rely, at least to some extent, on decent MIDI playback of our music, though most of us working professionally as composers also know that playback gives you only limited usable intelligence.
It is also generally agreed - at least in the circles in which I run - that MIDI playback in the professional engraving programs (by which, for now, I mean Finale, Sibelius, and, I guess, Dorico) is clunky and difficult to control. The concert music composers (for lack of a better term) I've spoken with who make a real effort to create high-quality, realistic MIDI mock-ups of their compositions, almost to a one, do the actual MIDI manipulation in a DAW such as Logic or Cubase, rather than fighting with the notation software. So the process goes something like this: compose the piece in the notation program using the built-in, inferior sounds, then export a MIDI file, open it in the DAW, and really perfect the MIDI rendition there. (And there may be a very good reason for doing this - like entering the piece into a competition, or trying to secure a commission for it.)
Interestingly, in my green foray into this world, I discovered that for composers working in film and video games, the process is reversed. That is to say, composition takes place in the DAW - often with the piano roll editor as the main composing environment - and then, only if necessary, the MIDI may be brought into a notation program to generate parts and a score for live musicians, who will only set eyes on this music if the budget allows it. I've learned a bunch about this world by hanging out on the vi-control forum, where I always feel very much like an alien (in a mostly enjoyable way).
Never one to be satisfied with the conventional wisdom, I spent a good part of the fall trying to master the vagaries of Finale's "Human Playback" system - its built-in system for translating musical notation into MIDI events. For instance, the presence of staccato markings over specific notes in the score needs to trigger a switch to the staccato sample, say on cello, so that you're not just hearing the same sound with shorter note values, but a different sample altogether - one actually played staccato. I struggled with customizing Finale's Human Playback settings for use with "third party" sample libraries (i.e. not the built-in Garritan samples) for the entirety of the orchestrating process, until, by the end, I had gotten quite good at it - and had also come to understand the real limitations of manipulating MIDI data in a program like Finale (it is less limited, I think, than a lot of people realize, and yet still... quite limited. I dream of writing a tome on the subject).
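For the curious, the mechanism behind that staccato switch can be sketched in a few lines. Most orchestral sample libraries listen for "keyswitch" notes - very low pitches outside the instrument's range - that tell the sampler which articulation to load for the notes that follow. The event format and the keyswitch pitches below are hypothetical, just to illustrate the idea; every library documents its own keyswitch map.

```python
# A minimal sketch of what a Human Playback-style system does under the
# hood: before a note whose articulation differs from the previous one,
# insert a low "keyswitch" note telling the sampler to change patches.
# Keyswitch pitches here are made up; real libraries define their own.

SUSTAIN_KS = 24   # C0 - hypothetical keyswitch for the sustain patch
STACCATO_KS = 25  # C#0 - hypothetical keyswitch for the staccato patch

def add_keyswitches(notes):
    """notes: list of (start_tick, pitch, articulation) tuples.
    Returns a new event list with a keyswitch event inserted just
    before each articulation change."""
    events = []
    current = None
    for start, pitch, artic in notes:
        if artic != current:
            ks = STACCATO_KS if artic == "staccato" else SUSTAIN_KS
            # fire the keyswitch one tick early so it lands first
            events.append((max(0, start - 1), ks, "keyswitch"))
            current = artic
        events.append((start, pitch, artic))
    return events

# A toy cello line: sustain, two staccato notes, back to sustain.
cello_line = [
    (0,   48, "sustain"),
    (480, 50, "staccato"),
    (720, 52, "staccato"),
    (960, 53, "sustain"),
]
print(add_keyswitches(cello_line))
```

The fiddly part in practice is exactly what the blog describes: getting the notation program to emit these switches for a third-party library's particular keyswitch map, rather than the one it assumes.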
When I was done orchestrating and preparing parts, I spent the last couple of weeks of 2016 doing an elaborate MIDI mock-up of one of the opera's scenes - the finale of Act I. I already had the pretty-goodish MIDI demo generated directly from Finale, but I wanted to see if I could achieve something better. I think, in a way, it was my way of not being able to let go of the project that had consumed so many years of my life. The result has clear strengths and weaknesses. Any MIDI mockup of a human voice asks a lot of a listener, and I'll understand if you don't survive nearly 9 minutes of this. But the voices here aren't actually that bad (the soloists are Vienna Symphonic Library's Vienna Solo Voices) - there are even a few moments where I think they sound genuinely pretty! Also - if you feel like fast-forwarding to about the middle, the character of the music changes from dissonant contemporary art music to a lilting, ensemble mariachi number (and there's a lovely little guitar cadenza right at the end).
When I was done with that, I finally let myself let the opera go, and tried my mockup skills on someone ELSE's music. Here's a quick job I did of Stravinsky's "Greeting Prelude," an adorable little version of Happy Birthday he wrote in celebration of Pierre Monteux's birthday. I entered the score into Finale, exported MIDI, and then played around for a couple of days in Logic:
After that, my semester started (after a fall sabbatical) and I was mad busy. I didn't really have time to play around with this stuff much. But then, in a composition lesson, a student played me some film music that had inspired him - just for piano and cello. More and more, students' first exposure to music composition comes via films and, to an even larger extent, video games. Hearing the piece he played me, I thought - here's an interesting challenge. Write a piece totally unlike myself, a pretty and sad bit of film music, and write TO a sample library (i.e. write specifically with my virtual instruments' capabilities in mind, rather than thinking of the human who would ultimately play it, with the MIDI playback as just a stopgap facsimile of the real thing). I picked up this very cool cello library by Cinesamples - called Tina Guo Acoustic Legato Cello - and wrote this short little romantic/filmy thing. My goal initially was just to be convincing and realistic, but like any music I work on, I grew to like it a good deal - you have to.
That's a little walk through some of the virtual music making I've been doing - mainly just to teach myself this world, and also to find a point of connection with some of my students. But I think also because I love the idea of just working up a complete thing, something not requiring interpretation or expensive rehearsals... something I just have and can share. I think that's what drew me to making rock albums. Not exactly sure where this little adventure takes me, and not sure how long I can dabble with it in the face of upcoming projects. But for the time being, I am a happy dabbler.