Control instrument choice of yoshimi, fluidsynth, timidity

Support & discussion regarding DAWs and MIDI sequencers.

Moderators: MattKingUSA, khz

kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

I want to get some synths added to my mix. I've been able to route my external keys (Oxygen 24 V1, and a basic Yamaha) into my system with no big issues. I run a2jmidid if I want to do the routing in JACK. I also have an older MIDI drum pad, a KAT DK10. I have been fond of ZynAddSubFX since I first met it. The distro I am using, Studio Linux(?), had Yoshimi installed, but I see I can install Zyn if I feel like it. I'm a little confused as to why the two forked; I'm guessing just developer 'issues'. I guess it's good they are both in active development anyway, yay! I installed the midisport-firmware package for the Oxygen to get a MIDI connection over USB. For the Yamaha I use a cheap MIDI-to-USB cable.

Works great: I can have them both connected to the same Yoshimi and it works. But instrument selection is painful, and I want to automate it; I don't want to mouse around. At some point a year ago I had mididings figured out enough to make some kind of split where the lowest few keys could be used to change the program/instrument. But I can't find that anymore, and I don't think I want those on the keyboard anyway. I think I want a program that lets me assign single keystrokes to push out a program change. Maybe that's a little custom Python script. I installed python-pypm, and that looks promising for cooking up a little console app to do this for me. Maybe there are better existing ways that some of you might suggest; this seems like it would be a common need.

There's also Pure Data, which looked pretty decent for rigging up random hardware and doing stuff like this. I am interested in trying out things like joysticks as el-cheapo synth controllers, say for pitch bend, aftertouch, modulation, etc. I'm also interested in maybe making foot controllers out of them. Maybe there are inexpensive USB foot controllers already? I understand there is a newer protocol, OSC, trying to improve on MIDI. Are there any good (and preferably inexpensive) USB single foot pedals? Think of the blues guy who plays guitar and has a single bass drum and maybe a hi-hat to whack with his foot; but instead of the real drums, do it with a synth and a USB pedal.

I mention FluidSynth and TiMidity because I have used those and like them. I'm starting to play with Hydrogen; got it running from the drum pad pretty easily, as a MIDI-driven synth. But Yoshimi/Zyn is my main target to control at the moment; I assume most of these synths can be controlled by MIDI. One annoyance I have with Yoshimi is mousing around to try out the instrument selection. It doesn't show anywhere (that I can see) what I selected, and I try so many of them that I simply can't recall whether I picked GlassPad 1, 2, 3 or 4, you get the idea.
That's a minor inconvenience. I want to find the ones I like, make a list of them, and quickly switch between them with a single keyhit on the computer keyboard.
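For what it's worth, the single-keystroke idea boils down to very little MIDI: a Program Change is just a two-byte message (status 0xC0 plus the channel, then the program number). Here's a minimal sketch of a favorites table mapped to console keys; the program numbers are invented placeholders, and actually sending the bytes would go through whatever backend you pick (pyportmidi, rtmidi, etc.):

```python
# Map single console keystrokes to raw MIDI Program Change bytes.
# The favorite program numbers below are made-up placeholders.

FAVORITES = {"1": 3, "2": 6, "3": 8, "4": 9}  # console key -> program number

def program_change(program, channel=0):
    """Return the raw bytes of a MIDI Program Change message."""
    assert 0 <= program <= 127 and 0 <= channel <= 15
    return bytes([0xC0 | channel, program])

def handle_key(key):
    """Translate one keystroke into a Program Change, or None if unmapped."""
    prog = FAVORITES.get(key)
    return program_change(prog) if prog is not None else None
```

The nice part of keeping the table in a dict is that the "list of ones I like" is just data you can edit without touching the routing logic.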

So my Yoshimi is the stock Studio Linux package (cat /etc/issue shows Ubuntu 13.10). dpkg -l yoshimi shows 1.1.0-2; I wonder if there are any updates or newer releases worth the hassle of installing or compiling. Any opinions on Yoshimi versus Zyn?

Any and all comments/opinions/suggestions welcome. Thanks for reading ;)
folderol
Established Member
Posts: 2080
Joined: Mon Sep 28, 2015 8:06 pm
Location: Here, of course!
Has thanked: 227 times
Been thanked: 400 times
Contact:

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by folderol »

I don't think this is the right place to go into the reasons for the split; however, while still instrument-compatible, Zyn and Yoshi are going in different directions.

The current version of Yoshimi is 1.3.9; the one you have is very old. All the up-to-date distros carry the new version, or you can get it here.
https://github.com/Yoshimi/yoshimi

You can now select banks, bank roots, and programs (instruments) entirely by sending standard MIDI CCs. Indeed, there is a YouTube demo of this being done with a single MIDI file that sets up everything from a 'bare' install.
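In generic MIDI terms that selection is the Bank Select pair (CC 0 = MSB, CC 32 = LSB) followed by a Program Change; which CC Yoshimi maps to the bank root versus the bank is configurable in its settings, so treat the assignments below as the plain MIDI convention, not Yoshimi specifics. A sketch of the three-message sequence as raw bytes:

```python
# Standard MIDI Bank Select + Program Change sequence as raw bytes.
# Yoshimi's mapping of bank root / bank onto these CCs is configurable,
# so the CC numbers here are only the generic MIDI convention.

def select_instrument(msb, lsb, program, channel=0):
    """Return [CC0 MSB, CC32 LSB, Program Change] raw MIDI messages."""
    status_cc = 0xB0 | channel
    return [
        bytes([status_cc, 0, msb]),        # CC 0: Bank Select MSB
        bytes([status_cc, 32, lsb]),       # CC 32: Bank Select LSB
        bytes([0xC0 | channel, program]),  # Program Change
    ]
```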
https://www.youtube.com/watch?v=rDIvkfoYYss

P.S.
The way you access these, and the number of access points, have been very considerably improved, along with general layout clarity.
The Yoshimi guy {apparently now an 'elderly'}
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

Thanks for the info. I see there are already one or two threads on Yoshimi/Zyn; they are both good and in development. That's great, I'll probably try both of them ;) It looks like for MIDI control messages it would be useful to upgrade to a recent version. I did find where it shows the bank/instrument selected on the main screen. I noticed my Yamaha PSR-290 does actually send out Program Change MIDI messages when I switch instruments on its keypad; I didn't think it sent them, but they are showing up in my MIDI monitor. And I know my Oxygen MidiMan 8 can send these messages using button key sequences, so I will get that figured out first, as it's probably a good general tool, then try to get these to change Yoshimi instruments. The Oxygen MidiMan V1 8 doesn't have any general-purpose buttons, but it does have about 8 knobs/pots that send out MIDI control messages; a value of 0-127 shows as you turn them. I was thinking that would not work well for changing instrument selection, but I guess it could work fine. Any of these will require some routing/mapping so I can make my own smaller list of the ones I like and want to use. I think doing that in a custom Python script might be nice, so I'll probably be working something up with mididings and/or the pyportmidi API. I'm surprised there are not more scripts or apps that do this sort of arbitrary user MIDI controller mapping for you. Maybe there are and I just haven't found them yet.

I guess the sky is the limit for controllers/foot pedals; a quick eBay search finds plenty of this sort of thing. It depends a lot on your use case. For example, if you have a guitar strapped around you while playing, you probably want a foot pedal. If you were doing a real gig/stage setup, old-school MIDI might be useful, seeing as USB cables have limited length. If I'm just overdubbing something, maybe the laptop keyboard is fine, or rigging something to the MIDI keys. I tend to be a cheap DIYer, so I kind of like the idea of trying to use commodity hardware. I've heard of people using smartphones for I/O; I suppose the wireless could be nice. Wireless joysticks/keyboards/mice might be interesting to play with. Anybody here doing this with Pure Data? I saw a few examples that made it look like a good tool for whipping up a quick script to act as a MIDI controller/router, with built-in support for MIDI, joysticks, etc. But I tend to favor Python, it's my favorite power tool, so that's probably where I'll go.

I was thinking about analog/foot/drum pedal ideas. One idea is to put a microphone element in a box: when you stomp the box, you get a click, which goes into a cheap USB sound dongle and gets routed to a little app that listens and converts it to a MIDI note-on, which in turn gets routed to a drum synth. Fun stuff to kick around ideas on. And it's neat that this is all very doable, especially with infinitely tweakable Linux.
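That stomp-box idea is really just crude onset detection: watch the incoming samples, and when the level crosses a threshold, emit a note-on, with a short refractory period so one stomp doesn't fire twice. A toy sketch over a plain list of samples (the threshold and refractory length are arbitrary; real code would pull frames from the USB dongle via ALSA or PortAudio):

```python
# Toy onset detector: turn level spikes in an audio buffer into
# MIDI note-on messages. Threshold/refractory values are illustrative.

def stomps_to_noteons(samples, threshold=0.5, refractory=100,
                      note=36, velocity=100, channel=9):
    """Return raw note-on byte messages, one per detected stomp."""
    events = []
    cooldown = 0
    for s in samples:
        if cooldown > 0:
            cooldown -= 1            # still inside the previous transient
        elif abs(s) >= threshold:
            events.append(bytes([0x90 | channel, note, velocity]))
            cooldown = refractory    # ignore the tail of this click
    return events
```

Note 36 on channel 10 (index 9) is the General MIDI bass drum, which fits the one-man-band use case.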
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

Update: I did find the mididings code I was working on over a year ago.
It had a flaw when used with the Yamaha YPT-300 keyboard: it kept kicking out program changes constantly. This turned out to be the Yamaha sending a constant stream of System Realtime timing messages, which resulted in an endless stream of Program Changes. I was using Gmidimonitor to show MIDI messages, but it apparently filters these SysRt messages and does not show them, so that stumped me for a while. I hooked up the keyboard to Qsynth, which has a helpful Channels panel that shows which channel, bank, and program is active.
Adding an extra Filter(NOTEON) filters those out. Here's some of the working code that allows program changes using the first 4 keys.

Code: Select all

portn = 0
chn = 0
prog = [
    Filter(NOTEON) >> KeyFilter('c1')  >> Program(portn, chn, 3),
    Filter(NOTEON) >> KeyFilter('c#1') >> Program(portn, chn, 6),
    Filter(NOTEON) >> KeyFilter('d1')  >> Program(portn, chn, 8),
    Filter(NOTEON) >> KeyFilter('d#1') >> Program(portn, chn, 9),
    KeyFilter('e1', 'c5') >> Channel(chn),
]
This mididings stuff is pretty bizarre. I thought my head was going to explode at times trying to deal with it. Ugh. It's kind of a neat project: some wicked C++ code does the work behind the scenes to make it fast and real-time, while the Python is just used as a sort of configuration language. They do provide a way to call into Python during the real-time processing, but then response times could become a problem, so it's not encouraged.

I have some interest in making what I'll call a 'layered synth'. A friend has a higher-end Yamaha that seems to add things depending on velocity: if you hit the key hard, it maybe adds a big 'gong' or 'bong' to the note. That's what I mean by layered. It seems like you could do that with just some MIDI routing/coding logic.
Here's a little snippet of the (non-real-time) mididings code where I tried to do something like that; it tries to add a note based on velocity.

Code: Select all

result = []
result.append(NoteOnEvent(ev.port, ev.channel, ev.note, ev.velocity))
if ev.velocity > 32:
    # layer an extra percussion hit on channel 10 for hard hits
    result.append(NoteOnEvent(ev.port, 9, 34 + ev.note - 32, ev.velocity))
return result
Hope to get a fresher version of Yoshimi installed soon that can process the program/bank change messages.
folderol
Established Member
Posts: 2080
Joined: Mon Sep 28, 2015 8:06 pm
Location: Here, of course!
Has thanked: 227 times
Been thanked: 400 times
Contact:

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by folderol »

If you are familiar with NRPNs, you can set up vector control, whereby just two user-defined CCs give X/Y control of 4 linked instruments. The default is any combination of Volume, Panning, and Brightness. Details are in Yoshimi's doc directory.
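For anyone who hasn't met NRPNs: one is sent as a pair of CCs selecting the parameter (CC 99 = NRPN MSB, CC 98 = NRPN LSB) followed by the data (CC 6 = Data Entry MSB, optionally CC 38 = LSB). A generic sketch of that message sequence; the actual parameter numbers for Yoshimi's vector control would come from its docs, not from here:

```python
# Generic NRPN write, expressed as the standard sequence of raw
# MIDI CC messages. The parameter numbers a given synth responds
# to are synth-specific (see Yoshimi's doc directory).

def nrpn(param_msb, param_lsb, data_msb, data_lsb=None, channel=0):
    """Return the CC messages that set one NRPN parameter."""
    status = 0xB0 | channel
    msgs = [
        bytes([status, 99, param_msb]),  # CC 99: NRPN parameter MSB
        bytes([status, 98, param_lsb]),  # CC 98: NRPN parameter LSB
        bytes([status, 6, data_msb]),    # CC 6:  Data Entry MSB
    ]
    if data_lsb is not None:
        msgs.append(bytes([status, 38, data_lsb]))  # CC 38: Data Entry LSB
    return msgs
```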
The Yoshimi guy {apparently now an 'elderly'}
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

Cool, yes I want to get a joystick fondling my synth ;) I did manage to compile the latest Yoshimi, and will be test-driving it next.
And just for grins and giggles, here are some links to my as-yet-unreleased upcoming big smash hits as the major recording music star kbongos, under the major recording company KbongosMusic MEGA-CORP:
'Get Your Buzz On': http://kiwi6.com/file/1c3hie4adu
'Fuzzy Thinking': http://kiwi6.com/file/dgwnebefe5
The first one is so bad it might be good; it's a first attempt to mix Yoshimi in, but it came out interesting.
Actually, I think I'm going to throw away 'Fuzzy Thinking', it's just a terrible doodle, but it does demo a second attempt to add Yoshimi to some guitar, using Audacity. I just love the deep organic sounds that come out of that synth; it's like it's not digital at all.
folderol
Established Member
Posts: 2080
Joined: Mon Sep 28, 2015 8:06 pm
Location: Here, of course!
Has thanked: 227 times
Been thanked: 400 times
Contact:

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by folderol »

Keep going. You'll get there :)

As for the richness of sound, that is all down to the work done by Paul Nasca. When he did the original sound designs he was one of the few people who recognised that real sounds are never a single frequency, but always a band of frequencies that interact with each other.
The Yoshimi guy {apparently now an 'elderly'}
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

It looks like Yoshimi has done some nice work on control lately: MIDI control, bank management, a console. I picked up an M-Audio Radium 49e keyboard that has 16 sliders/knobs. I'm going to start by making a Python console app, a simple menu/key-driven thing that will act as an output I can send to Yoshimi to test out the MIDI controller access. I have tested some of the M-Audio keyboard's MIDI commands, but ultimately I want this logic in the computer, so the sky will be the limit for what I can do with it. I may come back to mididings, but for now I think a simple console app to send the controls will be a good starting point. Going to try pyPortMidi, a Python wrapper around libportmidi, which is related to the other cross-platform PortAudio libs (Audacity uses these).

So I should be able to bump the effects mix level up and down, for example. Once I have it working from the console app, I eventually want to route it to a knob or slider on these keyboards. I upgraded one of my audio PCs from the Studio Linux installed a few years back to Debian 8. That went fairly well after a few real-time tweaks; thanks go out to the realtimeconfigquickscan.pl author and the wiki notes, which were really helpful.

Also looking at KXStudio, installed and running from a flash dongle; it's interesting to see what apps are on it. It's kind of neat that you can run these live distros off a USB stick to demo them, and even install things and save changes. So I installed qjackctl and yoshimi with apt-get, and tried out Qtractor some.

Oh, and one final plug for an x86 mini-distro, 'Voyage Linux', which I have used in the past; I just installed the latest Debian 8 variant in a virtual machine to test. This is a wonderful distro for embedded/small Debian: it's essentially just Debian, tweaked ever so slightly to be small and run on flash. You can apt-get install anything from Debian. It does default to a read-only primary file system (you need to run a special remountrw script, for example, to make changes), and it's initially console-only. So it's probably not as friendly to novices as, say, Puppy Linux, but I like the fact that it stays in the Debian camp. It could be a great starting point for transforming an old laptop or x86 box into a headless audio machine. Voyage also has variants targeted at being a music server/player box.
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

Got some Python working to send basic MIDI controls to Yoshimi. I've put it over here if anyone is interested:
http://kbongos.com/midi/test_yoshimi.py
Hope to make more progress on automating bank/program selection, and then on routing some of the controls to my synth knobs.
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

Hi Everyone,
I've been making progress on my MIDI controller experiments. I've been updating the Python program linked in the last message, if anyone is interested. It's primarily targeted at Yoshi and the Oxygen 8, which has 8 knobs/pots as CC-1 to CC-8, but it's intended as a general MIDI router experiment that could be applied to anything. It includes console menu selections to send various MIDI messages as a test bed, to listen to and dump MIDI messages, a Yoshi-specific router that tries to do neat things with the 8 knobs, and a Yoshi bank browser that reads the Yoshi config, displays banks and program selections, and sends the picked ones out over MIDI.
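For flavor, the bank-browser part mostly boils down to scanning the bank directories for .xiz instrument files, whose names (in the Zyn/Yoshimi convention, assumed here) encode the program number as a numeric prefix like NNNN-Name.xiz. A hedged sketch of just the filename parsing:

```python
import re

# Parse Zyn/Yoshimi-style instrument filenames such as
# "0005-GlassPad1.xiz" into (program_number, name) pairs.
# The NNNN-Name.xiz naming convention is an assumption here.
XIZ_RE = re.compile(r"^(\d+)-(.+)\.xiz$")

def parse_bank(filenames):
    """Return sorted (program, name) pairs from a bank's file listing."""
    entries = []
    for fn in filenames:
        m = XIZ_RE.match(fn)
        if m:
            entries.append((int(m.group(1)), m.group(2)))
    return sorted(entries)
```

Feeding the result into the Program Change / Bank Select senders gives you the "pick from a menu, send it out MIDI" loop.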

Three MIDI backend libs are rigged in: pypg (pyPortMidi), rtmidi, and alsaseq. I'd like to work in some mididings too. pypg does not do 'virtual clients', so you need to select the MIDI in/out targets from a list to connect to, and they don't show up in the system ALSA mappers. rtmidi supports both that style and the virtual client; alsaseq only does virtual clients. I think the virtual client that you map in the system has more value, but I am in the investigation phase now; they all seem to have their pros and cons.

Mididings is unique: it doesn't allow arbitrary use of the MIDI backend, it primarily runs pre-composed routing logic. It does allow hooking into that and then applying general Python logic, so it's got possibilities, and the composed routing blocks are neat, if you can get your head around them. So I've been ignoring mididings for the moment.

I mapped the first 4 knobs to the first 4 Yoshi system effect levels, which I set up as reverb, alienwah, phaser, and something else. These are accessed via MIDI NRPNs. The 5th knob is redirected to system Pan. The 6th knob is interesting: I route it to the Pan of whichever system effect level was last changed via knobs 1-4, an experiment in arbitrary/interesting/stateful routing. The last two knobs I just routed to system Portamento (CC-65) and Sustain; these act digitally, with values of 64 and above as on, below as off. It would be nice to have some on/off toggle buttons to use for this sort of thing, but the Oxygen just has the knobs.
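Treating a continuous knob as a switch is just thresholding the CC value, and adding a little hysteresis keeps jitter around the midpoint from flapping the state. A small sketch (the two threshold values are arbitrary):

```python
# Turn a 0-127 CC knob into a clean on/off toggle with hysteresis,
# so noise around the midpoint doesn't flip the state repeatedly.
# The on/off thresholds are arbitrary illustrative choices.

class KnobToggle:
    def __init__(self, on_at=80, off_at=48):
        self.on_at, self.off_at = on_at, off_at
        self.state = False

    def update(self, cc_value):
        """Feed a CC value (0-127); return the resulting on/off state."""
        if not self.state and cc_value >= self.on_at:
            self.state = True
        elif self.state and cc_value <= self.off_at:
            self.state = False
        return self.state
```

The gap between on_at and off_at is the dead band: once on, the knob has to come well down before the toggle releases.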

Yoshi is looking good; these controls can effectively be used for interactive playing, where they become part of the instrument. FluidSynth, on the other hand, doesn't seem to have many (if any) interactive controls available. I wasn't really able to test ZynAddSubFX due to a compile-time error on the latest code (on my Debian 8 stable). Maybe I should switch to testing, or try one of the new Debian-based distro offerings. I want to try the new AV Linux release, and KXStudio sounds like a nice repo.

Picked up an Axiom 24-key, which has some nice, different controls on it. It spit out nice MIDI out of the box; the drum pads are routed to channel 10, so they worked well for a demo. The knobs are different: digital, they click through 1-127 positions and map to NRPN data controls. I think this will be nice for digital selections, and could be applied to effects, although I think you have to turn them a few rounds to get the full range. Then there are about 6 DAW-style start, stop, rewind, and forward buttons that could be nice for various uses; they map to similar special-purpose MIDI messages. I like the transpose up/down buttons (the ones on the Oxygen are mechanically icky). It's got the polyphonic key pressure thing, and I see the MIDI messages come out if I lean on the keys. Not sure how useful this can be; it doesn't seem much different from what could be done with a wheel (like the second portamento wheel). I haven't really played with it much yet.

Oh, I've settled on ALSA MIDI routing versus JACK. It seems like the more direct way to go. I'm not entirely sure why I would want to use the JACK MIDI interface; it seems redundant next to ALSA. I'm guessing it's a timing thing: if I want to integrate it into DAW recordings, maybe it offers some advantages there.

All for now, over and out.
folderol
Established Member
Posts: 2080
Joined: Mon Sep 28, 2015 8:06 pm
Location: Here, of course!
Has thanked: 227 times
Been thanked: 400 times
Contact:

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by folderol »

Very pleased to see you're making progress with this. NRPNs seem to be very much the poor cousins of the MIDI family, which is a shame.

I've never used mididings myself, but I gather it's essentially a command-line (shell) program. If you're running in such an environment, it gives you another possible route into Yoshimi's controls. Although a long way from complete, the CLI actually gives you some controls that are not available from either MIDI or the GUI :wink:

I also work entirely with ALSA MIDI, but Jack audio. With the real-time playing I do, I've never found latency to be an issue and when working with external keyboards the main timing problem occurs between the keyboard and the chair :lol:
The Yoshimi guy {apparently now an 'elderly'}
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

Hey there!
Yes, you can use mididings as a command-line thing. It's a rather strange beast. It is quite elegant from a programming perspective; from a user perspective, it can be really easy and then again really hard. You arrange pre-composed routing blocks, like Transpose() or Filter(), to create a custom MIDI router (it does OSC too). So you can alter MIDI events, filter them, and tie them together in blocks. It uses Python as a scripting language to arrange and connect these blocks; the blocks themselves are instantiated as C++ objects (faster than Python code). You can insert a custom Python block as a hook to perform an arbitrary function.

I just worked some mididings into my Python script referenced above. I got it to dump out the MIDI events so you can see them, kind of like the MIDI monitor programs. I also added a mididings hook block that implements the Yoshi-targeted routing from my Oxygen CC knobs. I haven't figured out how to do arbitrary output commands with mididings yet. The basic premise of mididings is that you arrange the blocks and say GO. There's no easy way to start it and then allow arbitrary user input to adjust its operation (or to send events not triggered by other events). This arrangement can work fine for a large number of things, but not for everything. I wish it would expose its backend interfaces directly to Python, but it doesn't; you need to hook it, and that only responds to events.

The appealing thing about mididings is that it is a very well-written, modern MIDI routing engine. rtmidi is nice enough, and does either type of MIDI client. libportmidi (pypm) doesn't do virtual MIDI clients. alsaseq is all ALSA, which could give you useful things like connect/disconnect and ALSA channel information, but it doesn't do JACK. They all seem to have their pros and cons.

Mididings offers something it calls scenes that I need to investigate. I believe it uses threading and OSC messages to interact with the engine. I want to know if it can retain the ALSA MIDI client while interacting with it (so as not to have to tear down and restart the MIDI interfaces).

And yes, latency is not a huge factor for what I'm trying to do now, which is changing effect controls. This is generally not all that sensitive to latency; if I pan, crank up reverb, or switch banks/programs, 10 or 50 ms isn't a big deal.

So if anyone tries out my script above, you run it like:
test_yoshimi.py BACKEND
where BACKEND is pypg, rtmidi, alsaseq, or mididings; these are the choices for the library it loads to interface with MIDI. It's sort of a test bed for experimenting with the various MIDI interface libraries. It prompts you with a menu: what do you want to do? Listen to and dump received MIDI messages? Send a CC or note command? Run my CC-knob-to-Yoshi router? Browse Yoshi banks? At this point it's just a messy test bed to experiment and learn with.

I suppose it's all useless babble if you don't do Python or care to grok MIDI. I should probably find time to review what PD has to offer. ZynAddSubFX (and some DAWs?) I believe has a 'learning' mode that looks nice for the non-programmer, but that will have limitations (one-to-one control mapping, I would guess).

Well, that's probably enough babbling from me ;)
kbongosmusic
Established Member
Posts: 109
Joined: Sun Mar 06, 2016 9:14 pm
Location: Minneapolis

Re: Control instrument choice of yoshimi, fluidsynth, timidity

Post by kbongosmusic »

I'm kind of leaning toward rtmidi at the moment for MIDI interfacing; it seems well supported, middle of the road. I noticed mididings dropped out of Debian testing, unfortunately. I think it fills a useful niche, and hopefully it gets back into Debian. The mailing list suggested maybe it doesn't want to be a general-purpose MIDI interface lib. It does try to fill an interesting problem space, where you have routing building blocks that you can plug together into a more complex entity. It uses Python as the connecting lines, rather than, say, a GUI program that connects plugins. Pd, I think, attempts to provide that graphical connect-the-blocks programming realm, but it is definitely not shiny and new, so I am somewhat scared off by its 1970s feel.

It's kind of that whole GUI-versus-text programming dilemma. Electronic design has been mostly GUI-based (schematics), but then along comes Verilog and shows that text tools have their place. And the lack of text comments in schematics always upsets me. The same is true of a lot of the software text code people write.