Hallo Frank :-), thanks for your reply.
(I updated the subject since the discussion moved on.)
> While you could bang vsnapshot~ at samplerate and keep track of samples in a
> list or so, this is a waste of resources. Something like tabsend~ or tabwrite~
> is probably much better: just write a number of samples into a table and then
> analyse that for peaks. There are some externals for that, or do it manually.
As I understand it (and hopefully I am not completely wrong), you don't have to 'store' a list and 'then' read it again to find the max magnitude. It's as simple as the following:
(some signal)
*I*
*I*  [pd metro@samplerate]
*I* /
[vsnapshot~ ]
 |
[abs ]
 |
[moses ]X[t f ]
 |
[f ]
So in the last [f ] you have the max magnitude stored, and you can bang it to the [vu ] inlet, say at a fixed rate (which is the simplest way; it just requires a [metro ] banging [f ] and resetting [moses ]).
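In plain C terms, the per-sample logic is just this (a rough sketch only, to make the idea explicit; it is not Pd code):

/* sketch: running maximum of the absolute sample value,
   read out and reset at a slower display rate */
#include <math.h>

static float peak = 0.0f;           /* the last [f ] in the patch */

void per_sample(float insample)     /* [vsnapshot~] -> [abs ] -> [moses ] */
{
    float a = fabsf(insample);
    if (a > peak)                   /* [moses ] passes only new maxima, */
        peak = a;                   /* which then update [f ] */
}

float per_display_tick(void)        /* the slow [metro ] banging [f ] */
{
    float out = peak;               /* send this to the [vu ] inlet */
    peak = 0.0f;                    /* and reset [moses ]'s threshold */
    return out;
}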
I don't understand how using an array to store samples and then analysing it is more efficient than that. It seems to me you have to go through the samples twice: once to write the array in the first place and once more to analyse it.
a simple comparison:
read-analyse vs. read-write-read-analyse
Please correct me if I am missing something, and post an example using an array if you can.
alabala
On 2010-02-09 11:21, ypatios wrote:
> Hallo Frank :-), thanks for your reply.
> (I updated the subject since the discussion moved on.)
>
>> While you could bang vsnapshot~ at samplerate and keep track of samples in a
>> list or so, this is a waste of resources. Something like tabsend~ or tabwrite~
>> is probably much better: just write a number of samples into a table and then
>> analyse that for peaks. There are some externals for that, or do it manually.
>
> As I understand it (and hopefully I am not completely wrong), you don't have to
> 'store' a list and 'then' read it again to find the max magnitude. It's as
> simple as the following:
I haven't tested it, but you might be way more efficient using [tabsend~] to write to an array and analysing it, as Frank suggested.
Ah, and iemlib's [pvu~] has a rudimentary peak detector built in, if you can live with externals.
fgm,adr IOhannes
On 2010-02-09 11:32, IOhannes m zmoelnig wrote:
> I haven't tested it, but you might be way more efficient using [tabsend~] to
> write to an array and analysing it, as Frank suggested.
As a matter of fact, both perform crap (and while [env~] does something totally different, which requires more computation, it is way faster).
Probably I did something wrong.
fgamsdr IOhannes
Hallo,
ypatios hat gesagt: // ypatios wrote:
> As I understand it (and hopefully I am not completely wrong), you don't have to
> 'store' a list and 'then' read it again to find the max magnitude. It's as
> simple as the following:
>
> (some signal)
> *I*
> *I*  [pd metro@samplerate]
Note that [metro] has a lower limit of 1 ms, so you need to make your own metro-abstraction.
> *I* /
> [vsnapshot~ ]
>  |
> [abs ]
>  |
> [moses ]X[t f ]
>  |
> [f ]
> So in the last [f ] you have the max magnitude stored, and you can bang it to
> the [vu ] inlet, say at a fixed rate (which is the simplest way; it just
> requires a [metro ] banging [f ] and resetting [moses ]).
>
> I don't understand how using an array to store samples and then analysing it is
> more efficient than that. It seems to me you have to go through the samples
> twice: once to write the array in the first place and once more to analyse it.
Yes, but I assume that banging vsnapshot~ every sample is more expensive than just banging [tabwrite~] every 64 samples or less often, and table access is cheap. However, I did some tests now and it seems that the difference is not that big. I couldn't find a clear winner, so probably either solution can work for you. I didn't test the table-analysing externals from iem_tab, though, but did it "manually" using a counter and [max]x[f].
Frank
> Note that [metro] has a lower limit of 1 ms, so you need to make your own
> metro-abstraction.
Ah - I had no idea that this was true. I suggested using a metro with a rate of 1/44.1 ms as an alternative to [block~ 1], but I guess there's no way around it...
It seems like the best thing to have for this case would be an extern that just reports the level of the peak sample in each block. Time resolution would be limited to the 64 sample block size, but there's probably not much need for this to be faster, right? And you could always reblock if you need more. Maybe you're looking for a purely vanilla solution to this, but if not, the extern would be quick to make if someone hasn't made it already. Let me know and I'd be happy to do it.
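Roughly something like this, I imagine (a completely untested sketch against the plain Pd external API; "blockpeak~" is just a placeholder name):

#include <math.h>
#include "m_pd.h"

static t_class *blockpeak_class;

typedef struct _blockpeak
{
    t_object x_obj;
    t_float x_f;              /* dummy for CLASS_MAINSIGNALIN */
    t_float x_peak;           /* peak of the last block */
    t_clock *x_clock;         /* defers output out of the perform routine */
    t_outlet *x_out;
} t_blockpeak;

static void blockpeak_tick(t_blockpeak *x)
{
    outlet_float(x->x_out, x->x_peak);
}

static t_int *blockpeak_perform(t_int *w)
{
    t_blockpeak *x = (t_blockpeak *)(w[1]);
    t_sample *in = (t_sample *)(w[2]);
    int n = (int)(w[3]);
    t_sample peak = 0;
    while (n--)
    {
        t_sample a = fabsf(*in++);
        if (a > peak) peak = a;
    }
    x->x_peak = peak;
    clock_delay(x->x_clock, 0);   /* report at the next message time */
    return (w + 4);
}

static void blockpeak_dsp(t_blockpeak *x, t_signal **sp)
{
    dsp_add(blockpeak_perform, 3, x, sp[0]->s_vec, sp[0]->s_n);
}

static void *blockpeak_new(void)
{
    t_blockpeak *x = (t_blockpeak *)pd_new(blockpeak_class);
    x->x_out = outlet_new(&x->x_obj, &s_float);
    x->x_clock = clock_new(x, (t_method)blockpeak_tick);
    x->x_peak = 0;
    return (x);
}

static void blockpeak_free(t_blockpeak *x)
{
    clock_free(x->x_clock);
}

void blockpeak_tilde_setup(void)
{
    blockpeak_class = class_new(gensym("blockpeak~"),
        (t_newmethod)blockpeak_new, (t_method)blockpeak_free,
        sizeof(t_blockpeak), CLASS_DEFAULT, 0);
    CLASS_MAINSIGNALIN(blockpeak_class, t_blockpeak, x_f);
    class_addmethod(blockpeak_class, (t_method)blockpeak_dsp,
        gensym("dsp"), A_CANT, 0);
}

With something like that you wouldn't even need a [metro]: the object just puts out one float per block, which you can then smooth or rate-limit however you like.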
William
Hello,
I'm sorry I left the thread for some time now. Thank you all very much for your replies.
Your results are confirmed here too: the two methods with vsnapshot~ and writing/reading a table are equally inefficient. Matt's suggestion, whilst much more efficient, has a serious disadvantage: it seems that there is no way of resetting [max~ ] without losing some samples (which could include a "peak"). Generally, I had many confusing problems while testing that one. Has anyone else tried it?
I give up for now due to lack of time. If anyone has another idea, please share! :-)
(By the way, there is another limitation I noticed in [bang~ ]: it won't go faster than one bang every 64 samples, even if the block size is set below that.)
thanks again
alabala
Hallo,
I made an *almost* Pd vanilla version that performs really well. It's "almost" because it requires an operation that currently is an external, but Miller has expressed his intention to include it in one of the next versions here: http://lists.puredata.info/pipermail/pd-list/2009-08/071538.html
The external required in the attachment is called "tabwriteat~" and it is the signal equivalent to [tabwrite], i.e. it writes a number into a table at a position specified by a signal. This is immensely useful in many cases and has been sorely missing from Pd for a long time. Finding the peak of a signal is one possible application. tabwriteat~ is so useful because it adds a little bit of recursion to in-block processing, as you can store intermediate results in a table and read them again at the next sample of the same block.
For peak-finding the algorithm then becomes very simple: just find the peak of two samples with [max~], store the maximum in a table with [tabwriteat~] and reuse the table's value in the next comparison.
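In C terms the core of such a perform loop would be roughly this (just a sketch of the idea, not the code from the attachment):

#include "m_pd.h"

/* Sketch only: for every sample, take an index from the index signal and
   poke the corresponding value-signal sample into the table there.  (In a
   real external the table pointer would come from garray_getfloatwords().)
   If the index is held constant and the value comes from [max~] of the
   input and the value read back from that cell, the cell accumulates the
   running peak sample by sample within the block. */
static void tabwriteat_core(t_sample *value, t_sample *index,
    t_word *table, int tablesize, int n)
{
    int i;
    for (i = 0; i < n; i++)
    {
        int idx = (int)index[i];
        if (idx < 0) idx = 0;                   /* clamp to the table */
        if (idx >= tablesize) idx = tablesize - 1;
        table[idx].w_float = value[i];
    }
}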
Check out findpeak~.pd in the attachment, which also includes source and a Linux binary of tabwriteat~. Hopefully Miller includes it soon under whatever name (poke, poke ...) :)
Frank
ypatios hat gesagt: // ypatios wrote:
> Hello,
> I'm sorry I left the thread for some time now. Thank you all very much for
> your replies.
>
> Your results are confirmed here too: the two methods with vsnapshot~ and
> writing/reading a table are equally inefficient. Matt's suggestion, whilst
> much more efficient, has a serious disadvantage: it seems that there is no way
> of resetting [max~ ] without losing some samples (which could include a
> "peak"). Generally, I had many confusing problems while testing that one. Has
> anyone else tried it?
>
> I give up for now due to lack of time. If anyone has another idea, please
> share! :-)
>
> (By the way, there is another limitation I noticed in [bang~ ]: it won't go
> faster than one bang every 64 samples, even if the block size is set below
> that.)
>
> thanks again
> alabala
>
> --
> ypatios
On Fri, Feb 19, 2010 at 10:01:32AM +0100, Frank Barknecht wrote:
> Check out findpeak~.pd in the attachment, which also includes source and a
> Linux binary of tabwriteat~. Hopefully Miller includes it soon under whatever
> name (poke, poke ...) :)
Heck yeah! Tape delay emulation here we come. :)
Chris.