I just published another piece I've been working on these past few days. The first part actually comes from a song I wrote at least 15 years ago, which I then unfortunately lost forever to a broken disk: I didn't like how the song progressed at the time, so I started from scratch and re-imagined it to tell the myth of Orpheus and Eurydice instead:
If you're unfamiliar with the story, Orpheus was a legendary musician and master of the lyre. He was married to Eurydice, who was killed by a viper while trying to escape a threat. The myth then tells how Orpheus played his lyre to convince Hades and Persephone to allow Eurydice to leave the Underworld. They accepted, on the condition that he walk in front of her and never look back until they were both outside. Orpheus did that, but towards the end he started doubting she was really, or still, behind him, and that maybe the Gods had tricked him: just before getting out, he turned, and she was gone. Greek myths really loved a good tragedy!
The track I wrote tries to tell part of that story: Orpheus reaching the Underworld, and bringing Eurydice back. I tried to make Orpheus' lyre (well, harp in my piece... try finding a lyre sf2/sfz!) the only instrument that is always there, almost always with the same pattern shifting in and out of the different tones, to convey the feeling of his music trying to overcome all the other instruments, from the hopeful start up to the tragic ending: when Eurydice is gone, the lyre abruptly stops, and all that is left is a mourning melody. Did he lose her because of his fears and doubts, because he turned? Or were the Gods really tricking him? My track won't answer that: I just wanted to play it all from Eurydice's perspective, from when she first sees Orpheus coming, through their climb back towards the surface and his fears starting to take over, to the moment of loss: when the sound of his lyre, coming from an Orpheus she's now lost forever, slowly disappears.
Long intro aside, and coming to what may interest you most, this was my first attempt to use MIDI much more heavily in an Ardour-based project. I now understand why people often complain about the state of MIDI in Ardour: I found the whole process quite cumbersome at times, and I'll try to explain why in the next few paragraphs. More precisely, I worked on the MIDI tracks in a few different ways, to experiment with what each approach could offer.
Almost all of the tracks were written in Frescobaldi, so using Lilypond. I've learned to love the syntax these past few months, and I find it much easier to write long phrases that way, rather than drawing notes on a piano roll. This was the first cause of annoyance, though. Importing MIDI into Ardour is simple and works nicely, but there's no way to update a track once it's imported. Any time I changed even a single part in the Lilypond source, I had to re-import the whole MIDI file, copy the new track (e.g., for the flute) into place of the old one, and then remove all the tracks from the newly imported MIDI again. Very cumbersome... Editing the MIDI directly in Ardour is quite bad (more on this later), so that was not an option: I did try it briefly, to shorten some notes in the horns section for a more "staccato" effect, but it was quite painful. If Ardour had a way to edit a MIDI track in an external application once it's added, that would be amazing, but I'm afraid that's not possible. The other cause of annoyance is that Lilypond doesn't have a humanizer (at least not that I know of), so all notes have pretty much the same velocity, which sucks... unfortunately, Ardour doesn't have a "Humanize" option either (while it does have a quantizer), which would be great to have too; from what I read, you should expect plugins to provide that (e.g., those responsible for sequencing), but I didn't find any.
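In the end, "humanizing" is simple enough that you can script it yourself before importing. Here's a minimal sketch of the idea: jitter each note's velocity by a small random amount. The function name and the plain (pitch, velocity) tuples are just my own illustration; a real version would operate on the actual MIDI events (e.g. with the `mido` library) and could jitter note onsets too.

```python
import random

def humanize_velocities(notes, amount=10, seed=None):
    """Shift each note's velocity by a random offset in [-amount, +amount],
    clamped to the valid MIDI velocity range 1..127.
    `notes` is a list of (pitch, velocity) tuples (a stand-in for MIDI events)."""
    rng = random.Random(seed)  # seed it for reproducible renders
    out = []
    for pitch, velocity in notes:
        jittered = velocity + rng.randint(-amount, amount)
        out.append((pitch, max(1, min(127, jittered))))
    return out

# A flat Lilypond-style export: every note at the same velocity.
flat = [(60, 90), (62, 90), (64, 90), (65, 90)]
print(humanize_velocities(flat, amount=10, seed=1))
```

Crude, but it already makes long repeated patterns sound a lot less mechanical.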
That said, this is how I handled and sequenced the different tracks:
- For the harp (which, for effects reasons, is actually three separate tracks, although there's a main one), I used an SFZ called "Baroque Harp" from the Early Music Ensemble collection (https://musescore.org/en/node/267479). The other options I found (VPO, Ethereal Winds, some other SF2/SFZ) sounded much more lackluster than this. Anyway, Ardour does have a LinuxSampler plugin you can use for SFZ files: I found it a bit weird that the plugin is only a "stub", though, and that you have to configure it externally with QSampler... anyway, once you do, it remembers the configuration, so that worked. The only other thing I had to do was ensure the sampler listened on MIDI channel 1 only: when it was set to all channels, it would not work.
- Most of the other instruments (flute, fiddle, choirs, horns) were rendered via the default FluidSynth soundfont, using the Calf Fluidsynth plugin. Again, for some reason I couldn't find better sounds for those instruments and the effects I was after (I expected VPO to sound much better, in particular!). Anyway, the Calf Fluidsynth configuration was quite straightforward.
- The background strings you can hear in two separate tracks (ensemble and cellos) I wanted to play using a Windows VST. I recently found some free VSTs by Spitfire Audio (https://www.spitfireaudio.com/labs/) that sound really cool, and so I wanted to check how hard it would be to use them in Ardour. It turns out Ardour doesn't support VSTs out of the box (you need a special build), so I had to be a bit creative. I used jack-dssi-host to open an instance of the VST via dssi-wine, and then spawned a j2amidi_bridge instance, since jack-dssi-host only exposes an ALSA MIDI port, while Ardour only speaks JACK MIDI. Then I configured the MIDI track to output to the bridge, and added a new audio track that would get its input from the VST's audio output instead. This way, I could record the rendered audio and work on that in the mix. A bit problematic if you expect the tracks to change often (you'd have to re-record them), but it did the trick.
- The pad at the end is rendered using ZynAddSubFX as a plugin, which was quite easy to configure too. To make my life harder, though, I decided to actually WRITE the pad notes using Ardour's internal MIDI editing. As I anticipated, I found it quite bad, I'm afraid to say, and I can understand all those who recommend staying away from it... I couldn't find an easy way to have notes snap to the grid for some reason (I guess you can just quantize later?), and even durations are not discrete. Luckily I only had to write very few notes, so I was done with that soon.
- Finally, drums are the usual Hydrogen+DrumGizmo combo, and there's nothing to complain about there: I don't think I'll ever need a different approach, and scripting those percussive sections using kicks and toms was quite fun!
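Since QSampler is just a frontend for LinuxSampler's LSCP protocol, the harp setup above can also be sketched as raw LSCP commands (LinuxSampler listens on localhost:8888 by default, so you can feed it e.g. via netcat). Take this as a hedged, from-memory fragment rather than a verified session: the .sfz path is a placeholder and the exact command forms may differ in your LinuxSampler version.

```
# Rough LSCP sketch of what QSampler configures for you
CREATE AUDIO_OUTPUT_DEVICE JACK
CREATE MIDI_INPUT_DEVICE ALSA
ADD CHANNEL
SET CHANNEL AUDIO_OUTPUT_DEVICE 0 0
SET CHANNEL MIDI_INPUT_DEVICE 0 0
LOAD ENGINE SFZ 0
LOAD INSTRUMENT '/path/to/Baroque_Harp.sfz' 0 0
# Listen on MIDI channel 1 only (the detail that made it work for me)
SET CHANNEL MIDI_INPUT_CHANNEL 0 0
```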
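For the Spitfire bridge above, the whole thing boiled down to a handful of commands. This is only a rough setup sketch, not something to copy-paste blindly: the exact invocations depend on your dssi-vst/dssi-wine and a2jmidid builds, and all paths and client names here are placeholders from my notes.

```shell
# Tell dssi-vst where the Windows .dll plugins live (placeholder path)
export VST_PATH="$HOME/.wine/drive_c/VST"

# Host the VST through DSSI: this exposes an ALSA MIDI input port
# plus JACK audio output ports
jack-dssi-host dssi-vst.so &

# Ardour only speaks JACK MIDI, so spawn a JACK->ALSA MIDI bridge
j2amidi_bridge strings_bridge &

# Wire the ALSA side of the bridge into the DSSI host
# (check the actual client names/numbers with `aconnect -l`)
aconnect strings_bridge:0 jack-dssi-host:0

# Then, in Ardour: point the MIDI track's output at the bridge, and
# record the jack-dssi-host audio ports on a new audio track.
```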
Another cool thing I experimented with was reverbs. One of my best friends, who is a composer and a damn good sound producer and engineer (I mean, just listen to his albums on Spotify and tell me they're not amazing! https://open.spotify.com/artist/7s01ygGi7gw8oFmvxH5MU8), has been giving me a lot of tips since I started this journey into music production on Linux. Yesterday he introduced me to the concept of aux sends, and how they can greatly improve the way reverb works, especially when you're working with a lot of tracks! I sketched this in my project and it did simplify the process a lot: you basically create a separate bus, add the reverb only there (configured fully wet, with no dry signal), and then add an aux send to that bus on every track that should get the reverb. I ended up with two separate buses, because I needed different reverb properties for the intro/outro and for the song itself: on the whole, the end result could be much better, but in my defence there's a lot you can tweak there, between how much each track sends to the buses and how the reverbs themselves are configured... There will be plenty of time to improve later!
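If you're wondering why one wet-only bus can replace a reverb on every track: reverbs are (to a good approximation) linear, so running one reverb on the summed sends gives the same wet signal as running a reverb per track and summing afterwards. Here's a toy sketch of that, with a naive convolution standing in for the reverb; the track names, impulse response, and send levels are all made up for the example.

```python
def convolve(signal, ir):
    """Naive convolution: our linear stand-in for a reverb."""
    out = [0.0] * (len(signal) + len(ir) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(ir):
            out[i + j] += s * h
    return out

ir = [1.0, 0.5, 0.25]                  # tiny fake impulse response
flute = [0.0, 1.0, 0.0, -1.0]          # made-up track samples
harp = [0.5, 0.5, -0.5, -0.5]
send_flute, send_harp = 0.8, 0.3       # per-track aux-send levels

# One reverb on the bus (the sum of the sends)...
bus = [send_flute * f + send_harp * h for f, h in zip(flute, harp)]
wet_bus = convolve(bus, ir)

# ...equals a reverb per track, summed afterwards.
wet_each = [a + b for a, b in
            zip(convolve([send_flute * f for f in flute], ir),
                convolve([send_harp * h for h in harp], ir))]
assert all(abs(a - b) < 1e-9 for a, b in zip(wet_bus, wet_each))
```

So you pay for one reverb instance instead of N, and the per-track send level becomes your "how much reverb" knob.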
That's all, hope you'll enjoy the track and I'm looking forward to your feedback!