On Wed, Feb 16, 2005 at 06:25:15PM +0100, derek holzer wrote:
> Hey Pix!
> pix wrote:
>> why must we do deltime (msec) = (samplerate~ / Freq(Hz)) / 1000 instead?
> Again, my oops, it was: deltime (msec) = (samplerate~ / Freq in Hz) / (samplerate~ / 1000)
>> you shouldn't have to do this, you would only need to involve samplerate~ if you were calculating the number of samples in the delay line. but you don't need to, you just give it milliseconds and vd~ sorts out the annoying stuff internally.
> This assumes a sampling rate of 44100, however. Have a look at my last mail, where I revised the algorithm. I've used my patch at 44100, 48000 and 96000 Hz sampling rates and gotten consistent results, but I don't think that Davide's method would hold up at sample rates other than 44100 Hz. But hey, I'm no Miller S. Puckette ;-)
(samplerate / freq) / (samplerate / 1000)
= (samplerate / freq) * (1000 / samplerate)
= (samplerate * 1000) / (freq * samplerate)
// (the two samplerates cancel)
= 1000 / freq
that's why it works at different samplerates.
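to check the cancellation numerically, here's a quick sketch in plain C rather than a Pd patch (just to illustrate the algebra; the 440 Hz test frequency and variable names are mine):

  #include <stdio.h>

  /* check that (samplerate / freq) / (samplerate / 1000)
     reduces to 1000 / freq at any sample rate */
  int main(void)
  {
      const double rates[] = { 44100.0, 48000.0, 96000.0 };
      const double freq = 440.0;                    /* example pitch in Hz */

      for (int i = 0; i < 3; i++) {
          double sr = rates[i];
          double samples = sr / freq;               /* delay length in samples */
          double deltime = samples / (sr / 1000.0); /* converted to msec */
          printf("sr = %6.0f Hz -> deltime = %f msec\n", sr, deltime);
      }
      printf("1000 / freq      -> deltime = %f msec\n", 1000.0 / freq);
      return 0;
  }

it prints the same deltime (about 2.2727 msec for 440 Hz) at all three rates, which is derek's 44100 / 48000 / 96000 test done on paper.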
the reason you might see Karplus-Strong algorithms mention the samplerate is that all of the implementations i have seen actually use a table which is filled with noise to simulate a pluck. the table is then progressively smoothed while it is being played.
in that case, you need to know the samplerate to work out the size of the table. but with a delay line, you are just talking about length in terms of milliseconds, and it is up to the vd~ or delread~ object to work out the "physical" length of the buffer that it uses.
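here's a toy C sketch of that table-based version, just to show where the samplerate sneaks in (not a real-time implementation; the 440 Hz pitch, 44100 Hz rate, and 0.5 averaging coefficient are arbitrary examples of mine):

  #include <stdio.h>
  #include <stdlib.h>

  int main(void)
  {
      const double sr = 44100.0;      /* here the samplerate matters */
      const double freq = 440.0;      /* desired pitch in Hz */
      const int n = (int)(sr / freq); /* table length in samples */

      double *table = malloc(n * sizeof *table);
      if (!table) return 1;

      /* fill the table with noise: the "pluck" */
      for (int i = 0; i < n; i++)
          table[i] = 2.0 * rand() / RAND_MAX - 1.0;

      /* read the table repeatedly; each pass averages neighbouring
         samples, progressively smoothing the noise into a decaying tone */
      for (int period = 0; period < 100; period++)
          for (int i = 0; i < n; i++) {
              /* table[i] would go to the output here */
              table[i] = 0.5 * (table[i] + table[(i + 1) % n]);
          }

      printf("table length at %g Hz, samplerate %g Hz: %d samples\n",
             freq, sr, n);
      free(table);
      return 0;
  }

note that the samplerate only appears in computing n, the table size. with vd~ you never do that division yourself, which is exactly why the milliseconds formula is samplerate-independent.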
pix.