On Sat, Mar 2, 2013 at 12:28 PM, ronni montoya <ronni.montoya@gmail.com> wrote:

In this case entropy varies with time, and what I am interested in are
the entropy trajectories.

You can plot these trajectories and compare trajectories from
different songs.

More complex sound structures should have more complex trajectories:
not chaotic, not periodic, but more complex. The problem for me is
that I need to plot or visualize the entropy trajectories (values) in
order to see the complexity of a sound structure.

It would be nice to find a way to automate this, for example a way to
measure different trajectories algorithmically so that the computer
can tell automatically which one is more complex.
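Here's a rough sketch of the kind of automation you're describing. The score here is just my own guess at a starting point, nothing standard: rank each entropy trajectory by the Shannon entropy of its increments, so a flat or strictly periodic trajectory scores low and a more varied one scores higher.

# Hypothetical sketch (my own illustration): given entropy
# trajectories as 1-D arrays, score their "complexity" so the
# computer can rank them automatically.
import numpy as np

def trajectory_complexity(traj, n_bins=16):
    """Entropy (bits) of the histogram of trajectory increments."""
    steps = np.diff(np.asarray(traj, dtype=float))
    counts, _ = np.histogram(steps, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins before log
    return float(-(p * np.log2(p)).sum())

# Example: a periodic trajectory vs. a noisier, more varied one.
t = np.linspace(0, 20, 2000)
periodic = np.sin(t)
varied = np.sin(t) + 0.5 * np.sin(3.1 * t) + 0.2 * np.random.randn(t.size)
print(trajectory_complexity(periodic), trajectory_complexity(varied))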

The subject I've been reading lately is basic computational neuroscience, so I'll explain what I think is a similar example.  Shannon entropy, I think, is connected to how neurons transmit and process information.

When you have two signals that are highly correlated, they have high mutual information.  Neurons in the peripheral nervous system are specialized for transmitting information, so one PNS neuron's output should have high mutual information with its inputs.
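As an illustration of that (my own sketch, not any standard toolbox function), you can estimate mutual information from a joint histogram of the two signals:

# Estimate mutual information (bits) from a 2-D histogram.  Highly
# correlated signals share most of their entropy; independent signals
# share almost none.
import numpy as np

def mutual_information(x, y, n_bins=32):
    joint, _, _ = np.histogram2d(x, y, bins=n_bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

x = np.random.randn(10000)
print(mutual_information(x, x + 0.1 * np.random.randn(x.size)))  # high
print(mutual_information(x, np.random.randn(x.size)))            # near 0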

The kind of information involved is a categorization of the trajectories.  In terms of its output, a neuron is either firing or not firing; it's a lot like a binary variable, so it works somewhat like a digital signal.  The most information it can carry is bounded by the rate at which it switches between states.
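To make that bound concrete (illustrative numbers, not a real neuron): if each time bin of width dt is an independent firing/not-firing coin flip with probability p, the train carries H(p) bits per bin, so at most 1/dt bits per second.

# Upper bound on a binary spike train's information rate.
# The bin width and probabilities here are made up for illustration.
import numpy as np

def binary_entropy(p):
    """H(p) in bits for a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

dt = 0.001                     # 1 ms bins -> at most 1000 bits/s
for p in (0.05, 0.5, 0.95):
    print(p, binary_entropy(p) / dt, "bits/s upper bound")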

In terms of signals--the trajectories of neurons are solutions of nonlinear differential equations with lots of terms.  You break the neuron voltage out into state-space equations where each variable is an ion current or the axon hillock voltage.  In a neuron, some kinds of ions act quickly and inhibit, or two kinds of ions resonate when one is a fast exciter and the other a slow inhibitor.  A neuron is firing when its trajectory in the phase plane passes around an unstable equilibrium.
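A minimal sketch of that fast-exciter/slow-inhibitor picture is the FitzHugh-Nagumo model, a two-variable reduction of Hodgkin-Huxley.  With the drive current below, the equilibrium is unstable and the phase-plane trajectory orbits around it, i.e. the neuron spikes (parameters are the textbook defaults, not fit to any data):

# FitzHugh-Nagumo: v is the fast voltage-like variable, w the slow
# recovery (inhibitory) variable.  Plain Euler integration.
import numpy as np

def fitzhugh_nagumo(I=0.5, a=0.7, b=0.8, tau=12.5, dt=0.01, steps=20000):
    v, w = -1.0, 1.0
    traj = np.empty((steps, 2))
    for i in range(steps):
        dv = v - v**3 / 3 - w + I          # fast excitation
        dw = (v + a - b * w) / tau         # slow inhibition
        v += dt * dv
        w += dt * dw
        traj[i] = v, w
    return traj

traj = fitzhugh_nagumo()
print(traj[-5:])   # periodic spiking orbit in the (v, w) plane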

The thing is, that's more of a synthesis problem.  You just build your model, generate signals, and then you can calculate entropy, because you built the model to switch between known states.  The possible values of the system are constrained by the model--but what do you do if you don't know what the possible values are?
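The only workaround I know of (an assumption on my part, not part of the models above) is to let the data define the states: bin the observed values and take the entropy of the empirical bin distribution.

# Entropy over data-defined states when the state space is unknown.
import numpy as np

def empirical_entropy(signal, n_bins=32):
    """Shannon entropy (bits) over histogram bins fit to the data."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

print(empirical_entropy(np.random.randn(10000)))           # broad spread
print(empirical_entropy(np.sign(np.random.randn(10000))))  # ~1 bit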

I applied for grad school this spring down at FAU in the CCSBS.  Anything with BS in its name is great for me.  This time it stands for Brain Sciences :)