Dear list,
I have a question regarding timing in Pd:
I understand that messages to tilde objects only get passed on to the
DSP tree at DSP block boundaries.
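To make my mental model explicit, here is a rough C sketch of how I
imagine the scheduler loop works. This is not Pd's actual source; the
names (run_pending_clocks, compute_one_dsp_block) and the fixed
64-sample tick are my own assumptions:

    /* Rough sketch of my mental model of Pd's scheduler loop --
       not the real code, just how I picture it. */
    #include <stdio.h>

    #define BLOCKSIZE 64          /* samples per scheduler tick (assumed) */
    #define SAMPLERATE 44100.0

    static double logical_time_ms = 0;   /* Pd's logical clock */

    static void run_pending_clocks(double until_ms) {
        (void)until_ms;
        /* all [del], [metro], etc. whose time <= until_ms fire here,
           so their messages take effect before the next DSP block */
    }

    static void compute_one_dsp_block(void) {
        /* the whole signal graph advances by BLOCKSIZE samples */
    }

    int main(void) {
        double tick_ms = 1000.0 * BLOCKSIZE / SAMPLERATE;  /* ~1.45 ms */
        for (int tick = 0; tick < 4; tick++) {
            run_pending_clocks(logical_time_ms);
            compute_one_dsp_block();
            logical_time_ms += tick_ms;
            printf("tick %d at %.4f ms\n", tick, logical_time_ms);
        }
        return 0;
    }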
What about the reverse direction, from the signal domain back to
messages?
I found out that snapshot~ returns the last sample of the block
during which it got banged. This is fine, since that is the sample
value closest to the moment the result gets output.
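In C terms, I imagine snapshot~ doing something like the following;
again, this is only my guess at the mechanism, not the actual Pd
source:

    /* My guess at snapshot~'s mechanism: each DSP block stores its
       last input sample; a bang outputs whatever the most recently
       computed block stored. */
    #include <stdio.h>

    #define BLOCKSIZE 64

    static float stored;                   /* last sample of last block */

    static void snapshot_perform(const float *in) {
        stored = in[BLOCKSIZE - 1];        /* keep the final sample */
    }

    static void snapshot_bang(void) {
        printf("snapshot~: %f\n", stored); /* output the stored value */
    }

    int main(void) {
        float block[BLOCKSIZE];
        for (int i = 0; i < BLOCKSIZE; i++)
            block[i] = i / (float)BLOCKSIZE;  /* fake ramp signal */
        snapshot_perform(block);              /* one DSP tick */
        snapshot_bang();                      /* prints 0.984375 */
        return 0;
    }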
Now, when I use the following setup:
[bang~]
|
|
[t b b]
| |
| |
[timer]
I get a minimum logical time of 1.45 ms (equivalent to 64 samples at
44.1 kHz), even when I use a block size of [block~ 32]. That makes me
assume that the "sampling period" for an external process (such as an
audio clock, represented by bang~) is limited to 64 samples.
What is also interesting is that I get double that value, 2.9 ms, for
block sizes above 64, i.e. 65 and higher. There are no adc~ or dac~
objects or any subpatches in this setup.
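The numbers at least fit a 64-sample grid exactly; this quick check
(assuming 44.1 kHz) reproduces them:

    /* Durations of 64- and 128-sample blocks at 44.1 kHz. */
    #include <stdio.h>

    int main(void) {
        double sr = 44100.0;
        printf("64 samples:  %.2f ms\n", 64.0 / sr * 1000.0);   /* 1.45 */
        printf("128 samples: %.2f ms\n", 128.0 / sr * 1000.0);  /* 2.90 */
        return 0;
    }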
So, with predefined timing objects (such as [del]), the resolution of
the logical time reported by [timer] can be as fine as a fraction of
a millisecond, while non-deterministic events get sampled only every
64 samples, i.e. one DSP tick (or probably integer multiples thereof).
It is all still somewhat confusing; looking forward to any answers!
regards,
Peter