Hmm. OK. Then how can I measure sound card latency? My present understanding is that I run with the lowest number of frags at the smallest blocksize possible before I start hearing dropouts in the audio. Running with -noadc makes no difference to the lowest settings I can reach. There must be more smarts to it than that. Can someone clue me in on a better process for getting these variables set right?
(jfm3)
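One way to make that trial-and-error less blind is to know what each setting implies in milliseconds: the theoretical buffer latency is frags * blocksize / samplerate. A minimal sketch, assuming a 44.1 kHz sample rate (the function name is made up for illustration):

    # Theoretical buffer latency for a given Pd configuration:
    # frags fragments of blocksize samples each (44100 Hz is an assumption).
    def buffer_latency_ms(frags, blocksize, samplerate=44100):
        return frags * blocksize * 1000.0 / samplerate

    for frags in (2, 4, 8, 16):
        print(frags, round(buffer_latency_ms(frags, 64), 1), "ms")
    # 2 -> 2.9, 4 -> 5.8, 8 -> 11.6, 16 -> 23.2

If a given pair produces dropouts, it is the product frags * blocksize that you actually need to grow.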
Miller Puckette wrote:
Hi, one small comment... "blocksize" only controls the I/O block size, not the DSP block size, which is always 64. Control latency is always 64 samples, but the timing accuracy of control I/O depends on the I/O block size, since timing is all derived from ADC and/or DAC transfers, which are made once per I/O block.
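To put rough numbers on that (a sketch assuming a 44.1 kHz sample rate, which is not stated in the thread): the fixed 64-sample DSP block pins control latency at about 1.45 ms, while the wall-clock accuracy of control events is quantized to the I/O block size, since that is when ADC/DAC transfers happen.

    # Sketch: control is computed every 64 samples, but event timing is
    # only known to the nearest I/O block.  The 44100 Hz rate and all
    # names here are illustrative assumptions, not Pd internals.
    SR = 44100
    DSP_BLOCK = 64

    def quantize(sample_time, block):
        """Round an event time down to the start of its enclosing block."""
        return (sample_time // block) * block

    print(round(DSP_BLOCK * 1000.0 / SR, 2))  # 1.45 ms fixed control latency
    print(quantize(1000, DSP_BLOCK))          # 960: event on the 64-sample grid
    print(quantize(1000, 256))                # 768: coarser with I/O blocksize 256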
jfm3 wrote:
Thanks for your reply.
I set -noadc (since I don't really use the audio input anyway, just prerecorded material). The lowest I could get it was -frags 8 -blocksize 64 [...] Then, at that point, removing -noadc made no difference.
Andrew (Andy) W. Schmeder wrote:
blocksize is the dsp vector size, and also controls the temporal resolution of messages... i.e. your MIDI messages can only be processed on block boundaries. MIDI having a clock with 1 ms resolution means you want a small blocksize to match. bs = 64 => ~1.4 ms resolution, which is decent.
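The arithmetic behind that figure, assuming 44.1 kHz (at 48 kHz it comes out around 1.33 ms):

    # Message timing resolution for a few blocksizes at 44100 Hz
    # (an assumed rate; scale by your card's actual sample rate).
    SR = 44100
    for bs in (32, 64, 128, 256):
        print(bs, round(bs * 1000.0 / SR, 2), "ms")
    # 32 -> 0.73 ms (the only size here under MIDI's ~1 ms clock),
    # 64 -> 1.45 ms, 128 -> 2.9 ms, 256 -> 5.81 ms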
frags is latency... i.e. the number of blocks of lag. IMHO it's better to have more frags than a large blocksize.
I have an i8x0 sound card in a laptop as well (Toshiba Portege 3440CT), with the low-latency kernel patch and ALSA 0.9b. Basically the card sucks. In full duplex it's impossible to run with less than 15 ms latency... i.e. blocksize 64, 10 frags. However, if I disable the dac or the adc (-nodac or -noadc) then I can get it down to 1.5-3 ms latency... but usually that's not desirable.
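Those figures check out if one assumes 44.1 kHz, and they also illustrate the point about preferring frags over blocksize: two configurations with the same total latency can differ in message resolution. A quick sketch (the sample rate is an assumption):

    # Total buffer latency = frags * blocksize / samplerate (44100 Hz assumed).
    SR = 44100
    print(round(10 * 64 * 1000.0 / SR, 1))   # 14.5 ms: the full-duplex floor above
    print(round(2 * 64 * 1000.0 / SR, 1))    # 2.9 ms: roughly the half-duplex case
    print(round(5 * 128 * 1000.0 / SR, 1))   # 14.5 ms again: 5 frags of 128 gives
    # the same total latency, but blocksize 128 halves the timing
    # resolution of messages, which is why more, smaller frags win.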