On Fri, May 14, 2004 at 12:35:17AM +0200, Roman Haefeli wrote:
hi
i had a discussion with a teacher today. the topic was the smallest possible delay time. in his opinion one sample is the atom of a signal and cannot be divided any further. in my opinion it should be possible to get delays shorter than 1 sample with interpolation. my argument was: it should be possible to set the value of each sample so that the resulting signal is the same as a digitized analogue signal delayed by less than 1 sample.
supposing your interpolating delay wants to output something after 0.5 samples of time have passed... it will have to wait, its output being quantized by the overriding samplerate. with Pd's non-signal data - perhaps numbers specifying the delay time - you are not quantized by the samplerate, but by the block size, around 1.45 ms (64 samples at 44.1 kHz). there are ways around this involving some external libs, like iemlib's t3 objects, i think.
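just to make the interpolation idea concrete, here is a tiny C sketch of a delay line read at a fractional offset with plain linear interpolation. it has nothing to do with Pd's actual vd~ code (which i think uses a higher-order interpolator), and the names (delayline_write, delayline_read, etc.) are made up for illustration. it just shows that you can compute output samples that correspond to the input shifted by half a sample, even though those outputs are still emitted on the normal sample grid:

#include <stdio.h>
#include <math.h>

#define SR         44100.0f
#define TWO_PI     6.2831853f
#define DELAY_SIZE 1024

static float buf[DELAY_SIZE];   /* circular delay buffer           */
static int   writepos = 0;      /* next position to be written     */

/* push one input sample into the delay line */
static void delayline_write(float x)
{
    buf[writepos] = x;
    writepos = (writepos + 1) % DELAY_SIZE;
}

/* read the signal 'delay' samples in the past; 'delay' may be
   fractional, e.g. 0.5 for half a sample of delay */
static float delayline_read(float delay)
{
    float pos = (float)writepos - 1.0f - delay;   /* newest sample sits at writepos-1 */
    while (pos < 0.0f) pos += (float)DELAY_SIZE;  /* wrap into the buffer             */
    int   i0   = (int)pos;                        /* older neighbour                  */
    int   i1   = (i0 + 1) % DELAY_SIZE;           /* newer neighbour                  */
    float frac = pos - (float)i0;
    return buf[i0] * (1.0f - frac) + buf[i1] * frac;  /* linear interpolation */
}

int main(void)
{
    /* feed in a 1 kHz sine and read it back 0.5 samples later;
       compare with the same sine evaluated half a sample earlier */
    for (int n = 0; n < 16; n++) {
        float x = sinf(TWO_PI * 1000.0f * n / SR);
        delayline_write(x);
        float y     = delayline_read(0.5f);
        float ideal = sinf(TWO_PI * 1000.0f * ((float)n - 0.5f) / SR);
        printf("n=%2d  interpolated=% .5f  ideal 0.5-sample delay=% .5f\n",
               n, y, ideal);
    }
    return 0;
}

the very first line printed disagrees because the delay line starts out empty; after that the two columns track each other, apart from the small error that linear interpolation introduces.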
i've heard that certain programs like SynthEdit calculate all control data at the audio rate. to some extent you can already do this in Pd by just using signal objects for everything, but it would be very cool if a coding genius took up the challenge of redoing the m_* sources to support something like that, so that control changes take effect immediately while not bothering to recalculate every object 44100 times a second if nothing further up the chain has changed...
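for what it's worth, here is a very rough C sketch of that "only recompute when something changed" idea. it is not a proposal for how Pd's m_* code works or should work, just the caching/changed-flag pattern in miniature, with made-up names:

#include <stdio.h>

typedef struct {
    float value;    /* last input value seen             */
    float cached;   /* cached result of the computation  */
    int   changed;  /* set when the input actually moved */
} ctl_node;

/* stands in for some expensive control computation */
static float compute(float x)
{
    return x * 0.5f + 1.0f;
}

/* called once per sample tick, so changes take effect immediately,
   but the work is only redone when the input really changed */
static float ctl_perform(ctl_node *n, float in)
{
    if (in != n->value) {
        n->value   = in;
        n->cached  = compute(in);
        n->changed = 1;
    } else {
        n->changed = 0;
    }
    return n->cached;
}

int main(void)
{
    ctl_node n = { 0.0f, 0.0f, 0 };
    n.cached = compute(n.value);
    float in[8] = { 0, 0, 0, 3, 3, 3, 5, 5 };  /* one control value per sample tick */
    for (int i = 0; i < 8; i++) {
        float out = ctl_perform(&n, in[i]);
        printf("tick %d: in=%g out=%g recomputed=%d\n", i, in[i], out, n.changed);
    }
    return 0;
}

a real scheduler would propagate the changed flag down the dsp chain instead of printing it, so untouched objects further down could simply reuse their cached output.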
Does anybody know a good explanation for this problem?
i know this hasn't got anything to do with pd, but i think you are the right people to ask. by the way: if it is possible, what would a realization in pd look like?
thank you for helping me
roman