Hello, ypatios wrote:
As I understand it (and hopefully I am not completely wrong...), you don't have to 'store' a list and 'then' read it again to find the max magnitude. It's as simple as the following:
(some signal)   [pd metro@samplerate]
 |               /
 |              /
 |             /
[vsnapshot~ ]
 |
[abs ]
 |
[moses ]X[t f ]
 |
[f ]

Note that [metro] has a lower limit of 1 ms, so you need to make your own metro abstraction.
So the last [f ] holds the maximum magnitude, and you can bang it into the [vu ] inlet, say at a fixed rate (which is the simplest way: it just needs a [metro ] banging the [f ] and resetting [moses ]).
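For anyone who prefers plain code to box sketches, here is a minimal C sketch of the logic that the [abs]/[moses]/[f] chain above implements: keep a running peak, then output and reset it on each display bang. The function names and test values are made up for illustration only; this is not Pd code, just the same idea written out.

/* Running-peak logic, roughly what [vsnapshot~]->[abs]->[moses]X[t f]->[f ] does.
 * Names and test data are illustrative, not from any real patch. */
#include <math.h>
#include <stdio.h>

static float peak = 0.0f;       /* the value held in [f ] / [moses]'s right inlet */

/* called once per snapshot, like [vsnapshot~] -> [abs] -> [moses] */
void on_snapshot(float sample)
{
    float mag = fabsf(sample);  /* [abs ] */
    if (mag > peak)             /* [moses ]: only values above the threshold pass */
        peak = mag;             /* [t f ] feeds the new maximum back and stores it */
}

/* called at the display rate, like the [metro] banging [f ] and resetting [moses] */
float on_display_bang(void)
{
    float out = peak;           /* bang [f ]: send the stored maximum to [vu ] */
    peak = 0.0f;                /* reset the threshold for the next display period */
    return out;
}

int main(void)
{
    float test[] = {0.1f, -0.7f, 0.3f, -0.2f};
    for (int i = 0; i < 4; i++)
        on_snapshot(test[i]);
    printf("peak: %f\n", on_display_bang());  /* prints the stored peak, 0.7 */
    return 0;
}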
I don't understand how using an array to store the samples and then analysing it is more efficient than that. It seems to me you have to touch every sample twice: once to write it into the array in the first place and once more to analyse it.
Yes, but I assume that banging [vsnapshot~] every sample is more expensive than just banging [tabwrite~] every 64 samples or less often, and table access is cheap. However, I did some tests now, and it seems the difference is not that big. I couldn't find a clear winner, so probably either solution will work for you. I didn't test the table-analysing externals from iem_tab, though; instead I did it "manually" with a counter and [max]x[f].
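For comparison, here is a similarly rough C sketch of the table-based variant: write one block into the array, then scan it once for the peak, which is roughly what a counter driving [max]x[f] over the table does. The block size of 64 and all names are assumptions for illustration, not taken from any actual patch.

/* Block-wise peak over a stored table, roughly [tabwrite~] + counter + [max]x[f].
 * Block size and names are assumed for illustration. */
#include <math.h>
#include <stdio.h>

#define BLOCKSIZE 64

/* like [tabwrite~]: copy the current signal block into the array */
void write_block(const float *in, float *table)
{
    for (int i = 0; i < BLOCKSIZE; i++)
        table[i] = in[i];
}

/* like the counter + [max]x[f] loop: one pass over the stored block */
float table_peak(const float *table)
{
    float peak = 0.0f;
    for (int i = 0; i < BLOCKSIZE; i++) {
        float mag = fabsf(table[i]);
        if (mag > peak)
            peak = mag;
    }
    return peak;
}

int main(void)
{
    float block[BLOCKSIZE] = {0};
    block[10] = -0.8f;                        /* one loud sample in the block */
    float table[BLOCKSIZE];
    write_block(block, table);
    printf("peak: %f\n", table_peak(table));  /* prints 0.8 */
    return 0;
}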
Frank