jack_iodelay and LSP Latency Meter
Posted: Sun Mar 08, 2020 11:02 am
I've tried comparing results from jack_iodelay and LSP Latency Meter. They're very close but not identical.
Using JACK in sync mode, 48kHz, 2 periods per buffer, 64 frames per period, the theoretical value for this setup is :
nominal latency = ((1 + number of periods) * number of frames per period) / sample rate
nominal latency = ((1 + 2) * 64) / 48000 = 4 ms.
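Just to check the arithmetic, here's that formula as a few lines of Python (this is only the calculation, nothing is queried from JACK) :
Code:
# Nominal JACK round-trip latency in sync mode, in milliseconds.
def nominal_latency_ms(frames_per_period, periods_per_buffer, sample_rate):
    frames = (1 + periods_per_buffer) * frames_per_period
    return 1000.0 * frames / sample_rate

print(nominal_latency_ms(64, 2, 48000))  # 4.0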
For jack_iodelay :
Code:
253.631 frames 5.284 ms total roundtrip latency
For Latency Meter :

OK, so there's ~1.3ms extra latency. It's my hardware latency, which I set earlier at 31 frames on the input and 31 frames on the output. Adding that into the calculation :
latency = (192 + 62) / 48000 = 5.292 ms.
That's pretty accurate, although I must say I was expecting the measured value to be higher than the calculated value. The difference between jack_iodelay and Latency Meter is 0.007 ms. Now trying different buffer sizes :
With a 16 frame buffer : latency = (48 + 62) / 48000 = 2.292 ms
jack_iodelay :
Code:
108.631 frames 2.263 ms total roundtrip latency
Latency Meter :

Again the difference is 0.007 ms.
Now with a 1024 frame buffer : latency = (3072 + 62) / 48000 = 65.292 ms.
jack_iodelay :
Code:
3132.631 frames 65.263 ms total roundtrip latency
Latency Meter :

The difference is 0.007 ms.
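For reference, here's the same calculation run over the three period sizes, with the 31 + 31 frame hardware latency added in. The measured figures are just the jack_iodelay numbers quoted above, everything else is plain arithmetic :
Code:
SAMPLE_RATE = 48000
PERIODS = 2
HW_FRAMES = 31 + 31  # capture + playback latency as set in JACK

# frames per period -> measured round trip from jack_iodelay (ms)
measured_ms = {64: 5.284, 16: 2.263, 1024: 65.263}

for period, meas in measured_ms.items():
    frames = (1 + PERIODS) * period + HW_FRAMES
    calc = 1000.0 * frames / SAMPLE_RATE
    print(f"{period:4d} frames/period: calculated {calc:.3f} ms, measured {meas:.3f} ms")
#   64 frames/period: calculated 5.292 ms, measured 5.284 ms
#   16 frames/period: calculated 2.292 ms, measured 2.263 ms
# 1024 frames/period: calculated 65.292 ms, measured 65.263 ms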
The measured values are consistently lower than the calculated values. I think the reason is that I originally set JACK's latency compensation using a sample rate of 96kHz. Some of the hardware latency is due to a buffer, and that part takes less time as the sample rate goes up; some of it is a fixed time that doesn't vary with sample rate. So the hardware latency expressed in frames changes with sample rate, but only by a couple of frames, and the 31 + 31 frames I entered at 96kHz slightly over-compensates at 48kHz, which would explain why the measured values come out a little below the calculated ones.
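To make that concrete, here's a toy example; the split of the 31 frames into a buffer part and a fixed-time part is invented for illustration, I haven't measured it :
Code:
# Hypothetical split of the 31 frames per direction measured at 96kHz:
# a hardware buffer (a fixed number of frames) plus a fixed converter time.
BUFFER_FRAMES = 29           # made-up: stays 29 frames at any sample rate
FIXED_TIME_S = 2 / 96000.0   # made-up: the other 2 frames at 96kHz, as time

for rate in (96000, 48000):
    frames = BUFFER_FRAMES + FIXED_TIME_S * rate
    print(f"{rate} Hz: ~{frames:.0f} frames per direction")
# 96000 Hz: ~31 frames per direction
# 48000 Hz: ~30 frames per direction
# So the 31 frames entered into JACK at 96kHz over-compensates by a frame
# or so per direction at 48kHz, which would push the measured round trip
# slightly below the calculated one.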