hi
i had a discussion with a teacher today. the topic was the smallest possible delay time. in his opinion one sample is the atom of a signal and cannot be divided any further. in my opinion it should be possible to get delays shorter than one sample by interpolation. my argument was: it should be possible to set the value of each sample so that the resulting signal is the same as a digitized analogue signal delayed by less than one sample.
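to illustrate what i mean, here is a quick python/numpy sketch (the 48 kHz rate, the 1 kHz test tone and the 64-tap windowed-sinc filter are just my own arbitrary choices, nothing to do with pd): it delays a sampled sine by 0.3 samples with an interpolation filter and compares the result with the same sine sampled 0.3 samples later, as if it had gone through an analogue delay.

import numpy as np

fs = 48000.0          # sample rate (arbitrary for this sketch)
f = 1000.0            # test tone frequency
frac = 0.3            # desired delay: 0.3 samples, i.e. less than one sample
n = np.arange(256)

# the sampled tone, and the "analogue reference": the same tone sampled 0.3 samples later
x = np.sin(2 * np.pi * f * n / fs)
x_ref = np.sin(2 * np.pi * f * (n - frac) / fs)

# fractional delay by a 64-tap windowed-sinc interpolation filter
taps = 64
k = np.arange(taps) - taps // 2            # tap indices -32 .. 31
h = np.sinc(k - frac) * np.hamming(taps)   # sinc shifted by 0.3 samples, windowed
y = np.convolve(x, h)[taps // 2 : taps // 2 + len(n)]  # compensate the filter's integer delay

# away from the edges the interpolated signal should match the reference closely
err = np.max(np.abs(y[taps:-taps] - x_ref[taps:-taps]))
print("max error:", err)

if i run this, the error should come out tiny, which (at least for band-limited signals) is what i mean by "the sub-sample delay is already contained in the sample values".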
Does anybody know a good explanation for this problem?
i know this hasn't got anything to do with pd directly, but i think you are the right people to ask. by the way: if it is possible, how would a realization in pd look?
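(my own guess for the pd side, please correct me: something like [delwrite~] feeding [vd~], since [vd~] reads the delay line with 4-point interpolation and takes its delay time as a signal in milliseconds, so a fractional number of samples can be asked for. i just don't know what its minimum delay is.)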
thank you for helping me
roman