hmm, quickly thinking about it (and moving the discussion to the [OT] list)...
since you can't actually generate output between sample boundaries, even if you interpolate and work out a value corresponding to a delay of less than one sample, you will only be able to output it on the next sample. but latency is always going to be much more than one sample anyhow, so i guess i can assume you don't care about this.
the other thing is that you only have two samples to play with: the current sample and the previous one. so the best you can do is linear interpolation, which doesn't sound so great. (well, in theory you have access to all of the previous samples, but higher-order interpolation normally wants points placed symmetrically before and after the point you want to interpolate.)
if C is the current sample and L is the previous sample, you could linearly interpolate between them with something like C*(1-t) + L*t, where t is between 0 and 1 and is the fractional number of samples you want to delay (t=0 gives you C, t=1 gives you L, i.e. a full one-sample delay). you can get L in pd using the zexy "z" object.
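to make that concrete, here's a quick sketch in C (not pd code, and not the zexy object, just an illustration i made up) of a one-sample linear-interpolation delay:

/* sketch: fractional delay of t samples (0 <= t <= 1) by linear
 * interpolation between the current sample and the previous one */
#include <stdio.h>

static float frac_delay_tick(float current, float *last, float t)
{
    /* weight the previous sample by t, the current one by (1 - t) */
    float out = current * (1.0f - t) + (*last) * t;
    *last = current;            /* remember this sample for next time */
    return out;
}

int main(void)
{
    float last = 0.0f;
    float in[8] = {0, 1, 0, -1, 0, 1, 0, -1};  /* toy test signal */
    for (int n = 0; n < 8; n++)
        printf("%f\n", frac_delay_tick(in[n], &last, 0.25f)); /* 1/4 sample */
    return 0;
}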
if you are less concerned about realtime, you could keep a buffer of the last few input samples and do higher-order interpolation. your final output will be delayed by a few samples, but the relative delay between your input signal and your interpolated output can still be set with sub-sample precision.
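a hedged sketch of what i mean (again my own toy C code, not pd or zexy): a small ring buffer plus 4-point catmull-rom interpolation. the total delay has to be at least one sample so there is a known sample on each side of the read point, but the fractional part can be anything:

/* sketch: ring-buffer delay line with 4-point catmull-rom interpolation.
 * "delay" is in samples, must be >= 1 and <= BUFSIZE - 3.
 * zero-initialise the struct before use. */
#define BUFSIZE 16              /* power of two, so & MASK wraps the index */
#define MASK (BUFSIZE - 1)

typedef struct {
    float buf[BUFSIZE];
    unsigned int w;             /* index of the most recent sample */
} delay_t;

static float delay_tick(delay_t *d, float in, float delay)
{
    d->w = (d->w + 1) & MASK;
    d->buf[d->w] = in;

    int   i = (int)delay;       /* integer part of the delay */
    float f = delay - (float)i; /* fractional part */

    /* the four samples around the read point (delays i-1, i, i+1, i+2) */
    float y0 = d->buf[(d->w - (i - 1)) & MASK];
    float y1 = d->buf[(d->w -  i     ) & MASK];
    float y2 = d->buf[(d->w - (i + 1)) & MASK];
    float y3 = d->buf[(d->w - (i + 2)) & MASK];

    /* catmull-rom cubic between y1 and y2, evaluated at fraction f */
    float c1 = 0.5f * (y2 - y0);
    float c2 = y0 - 2.5f * y1 + 2.0f * y2 - 0.5f * y3;
    float c3 = 0.5f * (y3 - y0) + 1.5f * (y1 - y2);
    return ((c3 * f + c2) * f + c1) * f + y1;
}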
btw, i'm no academic, don't go using this as gospel to shout down your teacher. but feel free to use it as inspiration ;)
pix.
On Fri, May 14, 2004 at 12:35:17AM +0200, Roman Haefeli wrote:
hi
i had a discussion with a teacher today. the topic was the smallest possible delay time. in his opinion, one sample is the atom of a signal and cannot be divided any further. in my opinion it should be possible to get delays shorter than one sample with interpolation. my argument was: it should be possible to set the value of each sample so that the resulting signal is the same as a digitized analogue signal delayed by less than one sample.
Does anybody know a good explanation for this problem?
i know this hasn't got anything to do with pd, but i think you are the right people to ask. by the way: if it is possible, what would a realization in pd look like?
thank you for helping me
roman
A delay is, in general, a type of filter: an "all-pass" filter.
here is one of the first hits I got from google that would be relevant to your problem:
http://www.owlnet.rice.edu/~elec431/projects97/Phhh/431paper.html
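To illustrate the all-pass idea (a sketch under the usual first-order fractional-delay assumptions, not code taken from that paper): a first-order all-pass with coefficient a = (1 - D) / (1 + D) has a low-frequency phase delay of approximately D samples, and works best for D somewhere around 0.5 to 1.5:

/* sketch: first-order all-pass fractional delay,
 * y[n] = a*x[n] + x[n-1] - a*y[n-1].
 * zero-initialise the struct before use. */
typedef struct {
    float a;            /* all-pass coefficient */
    float x1, y1;       /* previous input and output */
} allpass_t;

static void allpass_set_delay(allpass_t *ap, float D)
{
    ap->a = (1.0f - D) / (1.0f + D);   /* D = 1 gives a = 0, a plain one-sample delay */
}

static float allpass_tick(allpass_t *ap, float x)
{
    float y = ap->a * x + ap->x1 - ap->a * ap->y1;
    ap->x1 = x;
    ap->y1 = y;
    return y;
}

The gain is exactly 1 at every frequency (hence "all-pass"); only the phase, i.e. the delay, changes.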