Order forcing works well for me. Just set the [delwrite~] to 10, but weirdness arises from a 0-length delay.
> If a delay of zero WAS actually permitted, these would form infinite loops.
I don't think so; it depends on what a 0 delay is. If it is "no delay", and by that I mean "no buffer", then it only outputs 0 values, right? In other words, nothing happens: no infinite loop, nothing, just zeros...
But if it does have some buffer length instead of just outputting zeros, then a feedback delay will always be delayed by at least one block, so no worries about a feedback loop.
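Here is a minimal sketch of what I mean, in plain C rather than Pd's actual source, with made-up buffer and block sizes: in a feedback loop the reader has to run before the writer, so the freshest samples it can possibly see are the ones written on the previous block.

/* Plain C sketch (not Pd code) of one block of unavoidable delay in a
 * feedback loop: the reader runs before the writer, so a "0" read can
 * only reach the samples written on the previous tick. */
#include <stdio.h>

#define BLOCKSIZE 4     /* tiny block, just for the demo */
#define BUFSIZE   16    /* hypothetical delay buffer length */

static float delaybuf[BUFSIZE];   /* circular delay buffer, starts at zero */
static int   writephase = 0;      /* where the writer will write next */

/* "delread~ 0" sorted before the writer: the freshest samples available
 * are the BLOCKSIZE samples written on the previous block */
static void read_block(float *out)
{
    int start = (writephase - BLOCKSIZE + BUFSIZE) % BUFSIZE;
    for (int i = 0; i < BLOCKSIZE; i++)
        out[i] = delaybuf[(start + i) % BUFSIZE];
}

/* "delwrite~": write one block and advance the write phase */
static void write_block(const float *in)
{
    for (int i = 0; i < BLOCKSIZE; i++)
        delaybuf[(writephase + i) % BUFSIZE] = in[i];
    writephase = (writephase + BLOCKSIZE) % BUFSIZE;
}

int main(void)
{
    for (int block = 0; block < 3; block++) {
        float in[BLOCKSIZE], out[BLOCKSIZE];
        for (int i = 0; i < BLOCKSIZE; i++)
            in[i] = block + 1;    /* block 0 carries 1s, block 1 carries 2s, ... */
        read_block(out);          /* reader runs first in a feedback loop */
        write_block(in);          /* writer runs after */
        printf("block %d: in = %g, out = %g\n", block, in[0], out[0]);
    }
    return 0;   /* prints out = 0, 1, 2: the input always shows up one block late */
}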
Actually, you need to set [delread~] to "0" to get the minimum delay of one block. Perhaps you meant you couldn't or shouldn't put a "0" delay time in the [delread~] object for feedback, but actually you NEED to do that.
The thing is, I just use [delwrite~] and [delread~] with 0-length arguments for both and a block size of 1 to allow single-sample feedback. I do it because I wanted the smallest possible delay buffer and I didn't want to type in the tiny, long, boring numbers that correspond to one sample at a given sample rate.
Since it was working, I had just always assumed it would create a buffer of one block.
But as I see it, that's not what's really happening.
I don't really care that much about what happens, it doesn't seem like a big deal, but it was nice to understand this behaviour. It doesn't seem very consistent, that's all I can say...
Now, what it actually does is really just a matter of design choices. It could very well create no delay buffer at all, in which case you'd perhaps get 0 values, like I imagined. That's silly anyway...
Or... it could be only one sample... or one block... I had assumed out of nowhere that it would be a block, but it could just as well be a single sample, which seems to make sense and would be cool, I guess.
What's really bad is that you always need to give a value of at least one block. It's a bug, considering the documentation clearly stated that the design was supposed to allow a delay anywhere between 0 and the maximum delay size. One way or another, it's really annoying to do all this math as a workaround when it's just a matter of coding it properly to allow any size greater than 0 and smaller than a block (in other words, to fix it).
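Just to spell out the math I'm talking about (plain C, nothing Pd-specific; 44100 and 64 are just the usual defaults): the smallest delay time you can actually give in a feedback loop, in milliseconds, depends on the block size and the sample rate.

/* Quick check of the sample/block durations in ms that you otherwise have
 * to type by hand into the delay time. */
#include <stdio.h>

int main(void)
{
    double sr = 44100.0;          /* sample rate in Hz */
    double blocksize = 64.0;      /* default Pd block size */

    double one_sample_ms = 1000.0 / sr;              /* ~0.02268 ms */
    double one_block_ms  = 1000.0 * blocksize / sr;  /* ~1.45125 ms */

    printf("one sample = %f ms\n", one_sample_ms);
    printf("one block  = %f ms\n", one_block_ms);
    return 0;
}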
> If you, however, want a simple block delay in a feedback loop,
> just use a pair of [send~] and [receive~].
They don't work for block sizes smaller than 64.
> specifying the buffer size makes much more
> sense than giving a maximum delay time
Those two things mean the same to me, where maximum delay time = buffer size. I don't get this.
> [delwrite~] object would need to keep track of this
sure, whatever, why not?
By the way, that's the object that defines the max delay length (or buffer size), and there can be only one, so it only needs to keep track of its own block size to work out the proper buffer size.
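Something like this is what I imagine (a rough C sketch of the idea, not the actual d_delay.c code; the single block of headroom is my assumption):

#include <math.h>
#include <stdio.h>

/* Rough guess at how a delwrite~-like object could size its buffer from the
 * max delay time argument plus its own block size. */
static int delay_buffer_samples(double max_delay_ms, double sr, int blocksize)
{
    /* samples needed to cover the requested maximum delay time */
    int nsamps = (int)ceil(max_delay_ms * sr * 0.001);
    /* one extra block of headroom so a reader sorted before the writer can
       still reach back the full maximum delay */
    return nsamps + blocksize;
}

int main(void)
{
    printf("1000 ms at 44100 Hz, block size 64: %d samples\n",
           delay_buffer_samples(1000.0, 44100.0, 64));
    return 0;
}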
I might see an issue if [delread~] is in a subpatch with a longer block size, but I don't see why anyone would need that; perhaps just say you shouldn't do it.
I think we've discussed this before; perhaps just make sure both are at the same block size. I, for one, never needed them to be at different block sizes; it makes no useful sense.
But anyway, I guess Miller is the one who should hop in and share his thoughts.
And let's not forget the "clear" method, which is also important :)
cheers