On Sun, 2011-01-09 at 20:26 -0500, Mathieu Bouchard wrote:
On Mon, 10 Jan 2011, ~E. wrote:
I'm looking for a way to detect changes in the compression of an audio signal. The purpose is to detect (and quantify) the compression changes between the music and the ads in a radio show. Any ideas?
If you don't have the original uncompressed recordings, I don't see how you could be doing that. You'd have to guess how complex sounds are supposed to fade out normally, to find out how much the fade out has been messed with.
And then, in the compressor, you have both a measurement of input volume and a formula for turning that input volume into a gain to be applied, and both of those parts are subject to a lot of variation and tweaking.
Assuming that the more compression is applied, the closer the RMS amplitude [1] gets to the peak amplitude [2] of an audio signal, you could measure both and probably get a rough estimate of how much compression was applied. This is just an idea; I don't have any reference confirming that it really works.
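As a rough sketch of what I mean (in Python rather than Pd, just to illustrate; in Pd you would presumably use [env~] for the RMS part), you could compute the peak-to-RMS ratio, i.e. the crest factor, over short windows. The window length, the function names and the dB conversion are arbitrary choices of mine, not anything established:

import numpy as np

def crest_factor_db(signal, sr, window_s=1.0):
    """Peak-to-RMS ratio (crest factor) in dB per analysis window.

    The assumption: a heavily compressed signal shows a lower crest
    factor, because the RMS level sits closer to the peak level.
    """
    win = int(sr * window_s)
    results = []
    for start in range(0, len(signal) - win + 1, win):
        frame = signal[start:start + win]
        peak = np.max(np.abs(frame))
        rms = np.sqrt(np.mean(frame ** 2))
        if rms > 0:
            results.append(20 * np.log10(peak / rms))
    return np.array(results)

# Hypothetical usage: compare two excerpts (music vs. ad); the one with
# the lower average crest factor would presumably be the more compressed.
# music_cf = crest_factor_db(music_samples, 44100)
# ad_cf = crest_factor_db(ad_samples, 44100)
# print(music_cf.mean(), ad_cf.mean())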
I could imagine that recordings of certain sets of natural instruments always show a similar relation between peak and RMS amplitude. However, usually some compression is already applied when a recording is released, which makes it hard to distinguish the compression applied in the radio station from the compression that ships with the recording. I could also imagine that it's much harder to find applicable rules for synthesized sounds.
Roman
[1] http://en.wikipedia.org/wiki/Amplitude#Root_mean_square_amplitude
[2] http://en.wikipedia.org/wiki/Amplitude#Peak_amplitude