A delay is, in general, a type of filter: an "all-pass" filter (it leaves the magnitude spectrum untouched and changes only the phase).
Here is one of the first hits I got from Google that is relevant to your problem:
http://www.owlnet.rice.edu/~elec431/projects97/Phhh/431paper.html
Hmm, thinking about it quickly (and moving the discussion to the [OT] list)...
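Since a sub-sample delay is just an interpolated read from a delay line, here is a minimal sketch in C of a linear-interpolation fractional delay (untested; the function and variable names are my own, and linear interpolation is only the simplest choice; higher-order or sinc interpolation does better at high frequencies):

#include <stdio.h>

#define DELAY_LEN 64 /* delay-line length in samples */

/* Read the delay line at a fractional offset by linearly
 * interpolating between the two neighbouring samples. */
static float frac_delay_read(const float *buf, int write_pos, float delay)
{
    int   i0   = (int)delay;          /* integer part of the delay   */
    float frac = delay - (float)i0;   /* fractional part, 0 <= f < 1 */

    int idx0 = (write_pos - i0     + DELAY_LEN) % DELAY_LEN;
    int idx1 = (write_pos - i0 - 1 + DELAY_LEN) % DELAY_LEN;

    /* weight the older sample by the fractional part */
    return (1.0f - frac) * buf[idx0] + frac * buf[idx1];
}

int main(void)
{
    float buf[DELAY_LEN] = {0};
    int   write_pos = 0;
    int   n;

    /* feed in a unit impulse and read it back delayed by 0.25
     * samples: the impulse comes out spread over two samples
     * with weights 0.75 and 0.25 */
    for (n = 0; n < 4; n++) {
        buf[write_pos] = (n == 0) ? 1.0f : 0.0f;
        printf("y[%d] = %f\n", n, frac_delay_read(buf, write_pos, 0.25f));
        write_pos = (write_pos + 1) % DELAY_LEN;
    }
    return 0;
}

Spreading the impulse over two samples with weights 0.75 and 0.25 is exactly the "set the sample values so the result looks like a shifted analogue signal" idea from the question below.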
On Fri, May 14, 2004 at 12:35:17AM +0200, Roman Haefeli wrote:
Hi,

I had a discussion with a teacher today. The topic was the smallest possible delay time. In his opinion, one sample is the atom of a signal and cannot be divided any further. In my opinion, it should be possible to get delays shorter than one sample using interpolation. My argument was: it should be possible to set the values of each sample so that the resulting signal is similar to a digitized analogue signal with a delay of less than one sample.

Does anybody know a good explanation for this problem?
I know this hasn't got anything to do with Pd, but I think you are the right people to ask. By the way: if it is possible, how would a realization in Pd look?
Thank you for helping me,
Roman
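To answer the Pd question directly: Pd's built-in [vd~] object already reads its delay line with four-point interpolation, so you can give it a fractional delay time and get a sub-sample delay. If you want the DSP spelled out, a common alternative to interpolated table reads is a first-order allpass interpolator, which also connects back to the "a delay is an all-pass filter" remark above. A minimal C sketch, with the coefficient formula a = (1 - d)/(1 + d) taken from the standard fractional-delay literature and names of my own:

#include <stdio.h>

/* First-order allpass fractional delay: approximates a delay of
 * d samples (usable roughly for 0.5 < d < 1.5) with unity gain
 * at every frequency, hence "all-pass".
 * Coefficient: a = (1 - d)/(1 + d). */
typedef struct {
    float a;  /* allpass coefficient           */
    float x1; /* previous input sample x[n-1]  */
    float y1; /* previous output sample y[n-1] */
} allpass1;

static void allpass1_init(allpass1 *ap, float d)
{
    ap->a  = (1.0f - d) / (1.0f + d);
    ap->x1 = 0.0f;
    ap->y1 = 0.0f;
}

/* difference equation: y[n] = a*x[n] + x[n-1] - a*y[n-1] */
static float allpass1_tick(allpass1 *ap, float x)
{
    float y = ap->a * x + ap->x1 - ap->a * ap->y1;
    ap->x1 = x;
    ap->y1 = y;
    return y;
}

int main(void)
{
    allpass1 ap;
    int n;

    allpass1_init(&ap, 0.5f); /* half-sample delay */

    /* print the impulse response */
    for (n = 0; n < 6; n++)
        printf("h[%d] = % f\n", n, allpass1_tick(&ap, (n == 0) ? 1.0f : 0.0f));
    return 0;
}

For d = 0.5 the coefficient is 1/3 and the impulse response decays with alternating sign; at low frequencies the filter's phase delay is close to half a sample, with unity gain everywhere.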