Hi, I was wondering if anybody has implemented the Shannon entropy function in Pd?
Has anybody tried measuring the entropy of a signal?
cheers
R.
Hi Ronni
How do you mean to do it?
Shannon entropy is not an independent measurement--the information in an observation is relative to the distribution of all its possible values.
If I just take one sample that's evenly distributed between -0.98 and 1 and quantized in 0.02 increments (to make the math easier), then there are 100 equally likely values, each with probability 0.01. The information of any observed value is -log2(0.01), and the entropy is the sum over all 100 values of -0.01*log2(0.01), i.e. log2(100), about 6.6 bits.
Then, if I had a signal that's N samples long (and the samples are independent), I have N times as much information. Or perhaps think of it as a rate of information.
But for real numbers and continuous distributions, this doesn't work. The information in a single observation diverges. So, doing that with floating point numbers is not practical.
You often see Shannon entropy describing digital signals. If the signal just switches between 0 and 1, we can generate a distribution of the data and see what the probability is empirically. The entropy of each new sample is relative to the distribution. Likewise, then if you know the maximum rate of switching, you can figure out the maximum rate of information in the signal.
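As a rough offline sketch of that empirical-distribution idea (Python/numpy rather than a Pd patch; the 64 bins and the [-1, 1] range are arbitrary choices):

import numpy as np

def empirical_entropy(block, n_bins=64):
    # Estimate Shannon entropy (bits per sample) from the empirical
    # amplitude distribution of one block of samples in [-1, 1].
    counts, _ = np.histogram(block, bins=n_bins, range=(-1.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return -np.sum(p * np.log2(p))

A signal that just switches between two values comes out near 1 bit per sample and a constant signal comes out at 0; multiplying bits per sample by the sample rate gives a (loose) upper bound on the information rate, since the samples are usually not independent.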
Just a few thoughts...
Chuck
Hi, why is it not possible? Instead of analysing the real-time value of the signal, maybe I can have a memory or buffer that stores a piece of the signal (a group of samples) from time to time and then analyze that group of values.
Maybe I can convert that group of values into a string and then use something like this:
http://www.shannonentropy.netmark.pl/calculate
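Presumably that calculator just counts symbol frequencies. A Python sketch of the same computation on a stored block (the 16-level quantization and the letter alphabet are arbitrary choices, just to get a string out of the samples):

from collections import Counter
from math import log2

def block_to_string(block, levels=16):
    # Quantize samples in [-1, 1] to `levels` symbols, returned as characters.
    out = []
    for x in block:
        i = int((x + 1.0) / 2.0 * (levels - 1) + 0.5)
        i = min(levels - 1, max(0, i))
        out.append(chr(ord('a') + i))
    return ''.join(out)

def string_entropy(s):
    # Shannon entropy of the character distribution, in bits per symbol.
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

string_entropy(block_to_string(block)) then gives one number per stored block.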
Another idea: I've seen Shannon entropy used for calculating complexity in terms of spatial configuration.
Maybe another option could be converting my signal into an image, for example using a similarity matrix, and then analyzing that image to get entropy values.
cheers
R
Why not do an FFT and measure the variance of the channels? For instance white noise has maximum entropy and all the bins of its FFT will be more or less the same, while a sine wave has low entropy and one bin will be much larger than the others.
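One way to turn that observation into a single number is to treat the normalized power spectrum as a probability distribution and take its entropy; a Python/numpy sketch of that spectral-entropy variant (offline, not a Pd patch):

import numpy as np

def spectral_entropy(block):
    # Entropy (bits) of the normalized power spectrum of one block:
    # flat spectrum (noise) -> high, one dominant bin (sine) -> low.
    power = np.abs(np.fft.rfft(block)) ** 2
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

n, fs = 4096, 44100
t = np.arange(n) / fs
print(spectral_entropy(np.random.uniform(-1, 1, n)))    # high, on the order of log2(n/2 + 1)
print(spectral_entropy(np.sin(2 * np.pi * 440 * t)))    # much lower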
Martin
If you took the FFT squared magnitude, perfectly noisy data should have a chi-squared distribution in each bin (I think). If you assumed that model and calculated the parameters of the distribution on each block, you'd find out how much information is in each of those peaks relative to the assumed distribution and just add it up.
Whatever algorithm you choose probably needs to pass some "common sense" tests like the one you mention, Martin: noise has more entropy than a sine wave. Also, if you take noise and just apply a comparison > 0, you get a signal with less entropy.
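A quick harness for those common-sense tests, using the amplitude-histogram entropy as the candidate measure (Python/numpy; the normalization to [-1, 1] and the 64 bins are assumptions):

import numpy as np

def hist_entropy(x, bins=64):
    # Entropy (bits) of the amplitude histogram over [-1, 1].
    counts, _ = np.histogram(x, bins=bins, range=(-1, 1))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

n = 1 << 16
noise = np.random.uniform(-1, 1, n)
sine = np.sin(2 * np.pi * 440 * np.arange(n) / 44100)
hard = np.sign(noise) * 0.99          # noise after a "> 0" style comparison

for name, sig in [("noise", noise), ("sine", sine), ("sign(noise)", hard)]:
    print(name, hist_entropy(sig))
# Expected ordering: noise (~6 bits with 64 bins) > sine > sign(noise) (~1 bit).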
Hi Charles, my idea in using Shannon entropy is to measure self-generated songs.
For example, if you have a patch that generates sound structures using generative rules, it would be nice to measure that sound structure and use the measurement to evolve the rules, in order to create more complex structures.
But how do you measure a sound structure using Shannon entropy?
I was experimenting with taking short pieces of a larger sound, converting each piece into a string, and evaluating the Shannon entropy of each string.
In this case the entropy varies with time, and what I am interested in are the entropy trajectories.
You can plot these trajectories and compare trajectories from different songs.
More complex sound structures should have more complex trajectories: not chaotic, not periodic, but more complex. The problem for me is that I need to plot or visualize the entropy trajectories (the values) in order to see the complexity of a sound structure.
It would be nice to find a way to automate this, for example a way to measure different trajectories algorithmically, so that the computer can tell automatically which one is more complex.
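Something like this windowed computation is what I mean (a rough Python sketch; the window, hop, and bin sizes are arbitrary, and the standard deviation at the end is just one crude stand-in for a complexity score):

import numpy as np

def entropy_trajectory(signal, win=2048, hop=1024, bins=32):
    # Amplitude-histogram entropy of each successive window.
    traj = []
    for start in range(0, len(signal) - win + 1, hop):
        w = signal[start:start + win]
        counts, _ = np.histogram(w, bins=bins, range=(-1, 1))
        p = counts / counts.sum()
        p = p[p > 0]
        traj.append(-np.sum(p * np.log2(p)))
    return np.array(traj)

def trajectory_score(traj):
    # Crude placeholder: a flat trajectory scores low, a varying one high.
    return np.std(traj)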
Do you have an idea?
I have a question: why do you say it would be meaningless to convert the signal into symbols?
Another thing I was experimenting with is using this with video and images, for example converting an image into an array of characters by iterating over all the pixels and getting the color of each pixel, then converting those values into characters and evaluating the Shannon entropy of each image.
I would like to expand this and use it also for self-generated 3D structures, but I'm still thinking about this.
cheers.
R.
Can you please explain why you say it would be meaningless?
"That would do something, but it may be meaningless--it would be just one way of converting the signal from real numbers to a discrete set of things/symbols that is easier to calculate with."
On Sat, Mar 2, 2013 at 12:28 PM, ronni montoya <ronni.montoya@gmail.com> wrote:
Hi Charles, my idea in using Shannon entropy is to measure self-generated songs.
For example, if you have a patch that generates sound structures using generative rules, it would be nice to measure that sound structure and use the measurement to evolve the rules, in order to create more complex structures.
Cool! That's a great idea!
But how do you measure a sound structure using Shannon entropy?
I guess I'm interested because it's a really tricky problem to define. There's no clear mathematical formula to apply. I'm happy to discuss how you might do it, but I don't know if it's been done correctly already--or if there are articles about entropy definitions for signals.
The important thing is whether it captures the properties of the signal you care about. If you have no math to start from, describe it verbally first.
I was experimenting with taking short pieces of a larger sound, converting each piece into a string, and evaluating the Shannon entropy of each string.
In this case the entropy varies with time, and what I am interested in are the entropy trajectories.
You can plot these trajectories and compare trajectories from different songs.
More complex sound structures should have more complex trajectories: not chaotic, not periodic, but more complex. The problem for me is that I need to plot or visualize the entropy trajectories (the values) in order to see the complexity of a sound structure.
It would be nice to find a way to automate this, for example a way to measure different trajectories algorithmically, so that the computer can tell automatically which one is more complex.
Do you have an idea?
Martin's suggestion about spectral distribution is good. Autocorrelation might also have some good properties--the signal has less entropy when it is more self-similar. This also starts to sound like fractal dimension, which can be calculated by a box-counting method.
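Here's a small sketch of the autocorrelation part in Python/numpy (the fractal-dimension/box-counting part is more involved, so it's not shown):

import numpy as np

def normalized_autocorrelation(block):
    # Autocorrelation of one block, normalized so lag 0 == 1.
    # White noise drops to ~0 immediately; a periodic signal keeps
    # strong peaks at multiples of its period (more self-similar).
    x = block - np.mean(block)
    full = np.correlate(x, x, mode='full')
    ac = full[len(x) - 1:]            # keep lags >= 0
    return ac / ac[0]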
I have a question: why do you say it would be meaningless to convert the signal into symbols?
It may be meaningless if you choose a bad rule to convert them into symbols. Here's an example of a meaningless rule. Convert ranges of signal values into discrete values:
-1 to -0.99 -> -99
-0.99 to -0.98 -> -98
...
-0.01 to 0 -> 0
0 to 0.01 -> 1
...
Then, if you had a signal and you multiplied it by 10, the entropy measured from the discrete values would increase. However--this does not mean the signal has more information. It just becomes louder.
If you decide to convert the signal into symbols, it has to be a meaningful rule. Otherwise, you might not be measuring the thing you meant to.
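To see the loudness problem concretely, here is that fixed-grid rule as a quick Python sketch (the 0.01 step matches the example above; same signal, just scaled by 10):

import numpy as np

def fixed_grid_entropy(x, step=0.01):
    # Quantize to a fixed absolute grid and take the entropy of the symbols.
    symbols = np.floor(x / step).astype(int)
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

quiet = np.random.uniform(-0.05, 0.05, 100000)
print(fixed_grid_entropy(quiet))        # about log2(10)  ~ 3.3 bits
print(fixed_grid_entropy(10 * quiet))   # about log2(100) ~ 6.6 bits -- louder, not more informative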
On Sat, Mar 2, 2013 at 12:28 PM, ronni montoya <ronni.montoya@gmail.com> wrote:
In this case the entropy varies with time, and what I am interested in are the entropy trajectories.
You can plot these trajectories and compare trajectories from different songs.
More complex sound structures should have more complex trajectories: not chaotic, not periodic, but more complex. The problem for me is that I need to plot or visualize the entropy trajectories (the values) in order to see the complexity of a sound structure.
It would be nice to find a way to automate this, for example a way to measure different trajectories algorithmically, so that the computer can tell automatically which one is more complex.
The subject I've been reading lately is basic computational neuroscience, so I'll explain what I think is a similar example: Shannon entropy is, I think, connected to transmitting and processing information in neurons.
When you have two signals that are highly correlated, they have high mutual information. Neurons in the peripheral nervous system are specialized for transmitting information, so one PNS neuron's output should have high mutual information with its inputs.
The kind of information involved is a categorization of the trajectories. In terms of its output, a neuron is either firing or not firing. It's a lot like a binary variable, so it sort of works like a digital signal. The most information it can carry is bounded by the rate at which it switches between states.
In terms of signals--the trajectories of neurons are solutions of non-linear differential equations with lots of terms. You break out the neuron voltage into state-space equations where each of the variables is an ion current or the axon hillock voltage. In a neuron, some kinds of ions act quickly and inhibit, or two kinds of ions resonate when one is a fast exciter and the other is a slow inhibitor. A neuron is firing when the trajectory in the phase plane passes around an unstable equilibrium.
The thing is, that's more of a synthesis problem. You just go ahead and build your model, generate signals, and then you can calculate entropy, because you built the model to switch between states. The possible values of the system are constrained by the model--but what do you do if you don't know what the possible values are?
I applied for grad school this spring down at FAU in the CCSBS. Anything with BS in its name is great for me. This time it stands for Brain Sciences :)
On Wed, Feb 27, 2013 at 7:40 AM, ronni montoya <ronni.montoya@gmail.com> wrote:
Hi, why is it not possible?
What I meant was using floating-point numbers as an approximation of real numbers. We have a finite number of samples, so it's impossible to work with continuous distributions except by approximation. However, brainstorming a few methods of approximation is good. I'm not particularly an expert on the subject of entropy, but I enjoy it.
Instead of analysing the real-time value of the signal, maybe I can have a memory or buffer that stores a piece of the signal (a group of samples) from time to time and then analyze that group of values.
If you're analyzing only pieces, you might wonder whether the signal behaves the same all the time. There are many "bursting" phenomena that are interesting. Those kinds of signals have long-term correlations that give them lower entropy--but any small segment does not capture that behavior.
Maybe I can convert that group of values into a string and then use something like this:
That would do something, but it may be meaningless--it would be just one way of converting the signal from real numbers to a discrete set of things/symbols that is easier to calculate with.
Since you brought up the topic--I was reading on Wikipedia about how Shannon entropy is used to obtain lower bounds on compression ratios. There are some types of audio compression--could you find a connection there?
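One cheap way to poke at that connection is to use a general-purpose compressor as a rough proxy; the compressed size per symbol is only an upper-bound estimate of the entropy, and the 8-bit quantization and zlib are arbitrary choices here:

import zlib
import numpy as np

def compression_ratio(block, levels=256):
    # Quantize to 8-bit symbols and see how much zlib can shrink them.
    q = np.clip((block + 1.0) / 2.0 * (levels - 1), 0, levels - 1).astype(np.uint8)
    raw = q.tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

n = 1 << 16
print(compression_ratio(np.random.uniform(-1, 1, n)))                      # ~1: noise barely compresses
print(compression_ratio(np.sin(2 * np.pi * 440 * np.arange(n) / 44100)))   # much smaller: highly redundant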