From: "Frank Barknecht" <fbar@footils.org>
To: "post pd-msg" <pd-list@iem.kug.ac.at>
> The shortest possible delay time then in theory is 1/(Nyquist frequency in Hz) seconds.
> = 2 samples?
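(For the arithmetic: with a sample rate f_s, the Nyquist frequency is f_s/2, so 1/f_Nyquist = 2/f_s, i.e. two sample periods.)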
Why? I'm talking about a signal that is already in the digital domain; I don't need to digitize it and convert it back to analog within that time. What I'm looking for is a way to make the delay between the original and the delayed signal shorter than one sample. I don't care if both signals end up delayed by more than one sample overall, as long as the difference between them stays below one sample.
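For illustration, one way to get such a sub-sample delay is to read a delay line at a fractional position and interpolate between the two neighbouring samples; [vd~] does something similar with higher-order interpolation. Here is a minimal sketch in C using plain linear interpolation; the buffer size, the function name frac_delay_tick and the choice of linear interpolation are just assumptions for the example, not Pd internals:

/* Minimal sketch of a fractional (sub-sample) delay line using linear
 * interpolation.  All names here are made up for the example; this is
 * not Pd source code. */
#include <stdio.h>

#define BUFSIZE 64                 /* delay line length in samples */

static float buf[BUFSIZE];         /* circular delay buffer (zeroed) */
static int   writepos = 0;         /* current write index */

/* Write one input sample, then read the line "delay" samples back.
 * "delay" may be fractional (e.g. 0.5) but should stay below BUFSIZE - 1. */
static float frac_delay_tick(float in, float delay)
{
    buf[writepos] = in;                        /* store current input */
    float readpos = (float)writepos - delay;   /* fractional read position */
    while (readpos < 0) readpos += BUFSIZE;    /* wrap around the buffer */
    int   i0   = (int)readpos;                 /* integer part */
    float frac = readpos - (float)i0;          /* fractional part */
    int   i1   = (i0 + 1) % BUFSIZE;
    /* linear interpolation between the two neighbouring samples */
    float out  = buf[i0] + frac * (buf[i1] - buf[i0]);
    writepos = (writepos + 1) % BUFSIZE;
    return out;
}

int main(void)
{
    /* Delay a ramp 0,1,2,... by half a sample: the output is the ramp
     * shifted by 0.5, which no whole-sample delay could produce. */
    for (int n = 0; n < 8; n++)
        printf("n=%d  in=%d  delayed=%.2f\n", n, n, frac_delay_tick((float)n, 0.5f));
    return 0;
}

With a higher-order interpolator the half-sample shift is rendered more accurately at high frequencies (linear interpolation acts as a mild lowpass), but the principle is the same.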